[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
30575 1726867564.41999: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Isn
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
30575 1726867564.42333: Added group all to inventory
30575 1726867564.42335: Added group ungrouped to inventory
30575 1726867564.42339: Group all now contains ungrouped
30575 1726867564.42342: Examining possible inventory source: /tmp/network-5rw/inventory.yml
30575 1726867564.53281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
30575 1726867564.53326: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
30575 1726867564.53342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
30575 1726867564.53383: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
30575 1726867564.53434: Loaded config def from plugin (inventory/script)
30575 1726867564.53435: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
30575 1726867564.53462: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
30575 1726867564.53521: Loaded config def from plugin (inventory/yaml)
30575 1726867564.53525: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
30575 1726867564.53583: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
30575 1726867564.53860: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
30575 1726867564.53863: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
30575 1726867564.53865: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
30575 1726867564.53869: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
30575 1726867564.53872: Loading data from /tmp/network-5rw/inventory.yml
30575 1726867564.53915: /tmp/network-5rw/inventory.yml was not parsable by auto
30575 1726867564.53961: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
30575 1726867564.53990: Loading data from /tmp/network-5rw/inventory.yml
30575 1726867564.54045: group all already in inventory
30575 1726867564.54051: set inventory_file for managed_node1
30575 1726867564.54053: set inventory_dir for managed_node1
30575 1726867564.54054: Added host managed_node1 to inventory
30575 1726867564.54056: Added host managed_node1 to group all
30575 1726867564.54056: set ansible_host for managed_node1
30575 1726867564.54057: set ansible_ssh_extra_args for managed_node1
30575 1726867564.54059: set inventory_file for managed_node2
30575 1726867564.54060: set inventory_dir for managed_node2
30575 1726867564.54061: Added host managed_node2 to inventory
30575 1726867564.54062: Added host managed_node2 to group all
30575 1726867564.54062: set ansible_host for managed_node2
30575 1726867564.54063: set ansible_ssh_extra_args for managed_node2
30575 1726867564.54064: set inventory_file for managed_node3
30575 1726867564.54066: set inventory_dir for managed_node3
30575 1726867564.54066: Added host managed_node3 to inventory
30575 1726867564.54067: Added host managed_node3 to group all
30575 1726867564.54067: set ansible_host for managed_node3
30575 1726867564.54068: set ansible_ssh_extra_args for managed_node3
30575 1726867564.54070: Reconcile groups and hosts in inventory.
30575 1726867564.54073: Group ungrouped now contains managed_node1
30575 1726867564.54074: Group ungrouped now contains managed_node2
30575 1726867564.54075: Group ungrouped now contains managed_node3
30575 1726867564.54128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
30575 1726867564.54207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
30575 1726867564.54238: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
30575 1726867564.54257: Loaded config def from plugin (vars/host_group_vars)
30575 1726867564.54259: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
30575 1726867564.54264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
30575 1726867564.54270: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
30575 1726867564.54299: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
30575 1726867564.54529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867564.54599: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
30575 1726867564.54623: Loaded config def from plugin (connection/local)
30575 1726867564.54625: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
30575 1726867564.54993: Loaded config def from plugin (connection/paramiko_ssh)
30575 1726867564.54996: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
30575 1726867564.55541: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
30575 1726867564.55567: Loaded config def from plugin (connection/psrp)
30575 1726867564.55569: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
30575 1726867564.55968: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
30575 1726867564.55993: Loaded config def from plugin (connection/ssh)
30575 1726867564.55996: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
30575 1726867564.57222: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
30575 1726867564.57245: Loaded config def from plugin (connection/winrm)
30575 1726867564.57247: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
30575 1726867564.57267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
30575 1726867564.57313: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
30575 1726867564.57352: Loaded config def from plugin (shell/cmd)
30575 1726867564.57353: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
30575 1726867564.57369: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
30575 1726867564.57407: Loaded config def from plugin (shell/powershell)
30575 1726867564.57409: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
30575 1726867564.57447: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
30575 1726867564.57550: Loaded config def from plugin (shell/sh)
30575 1726867564.57552: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
30575 1726867564.57573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
30575 1726867564.57647: Loaded config def from plugin (become/runas)
30575 1726867564.57648: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
30575 1726867564.57757: Loaded config def from plugin (become/su)
30575 1726867564.57759: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
30575 1726867564.57852: Loaded config def from plugin (become/sudo)
30575 1726867564.57854: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
30575 1726867564.57875: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml
30575 1726867564.58086: in VariableManager get_vars()
30575 1726867564.58100: done with get_vars()
30575 1726867564.58186: trying /usr/local/lib/python3.12/site-packages/ansible/modules
30575 1726867564.60091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
30575 1726867564.60163: in VariableManager get_vars()
30575 1726867564.60167: done with get_vars()
30575 1726867564.60169: variable 'playbook_dir' from source: magic vars
30575 1726867564.60169: variable 'ansible_playbook_python' from source: magic vars
30575 1726867564.60170: variable 'ansible_config_file' from source: magic vars
30575 1726867564.60170: variable 'groups' from source: magic vars
30575 1726867564.60171: variable 'omit' from source: magic vars
30575 1726867564.60171: variable 'ansible_version' from source: magic vars
30575 1726867564.60172: variable 'ansible_check_mode' from source: magic vars
30575 1726867564.60172: variable 'ansible_diff_mode' from source: magic vars
30575 1726867564.60173: variable 'ansible_forks' from source: magic vars
30575 1726867564.60173: variable 'ansible_inventory_sources' from source: magic vars
30575 1726867564.60173: variable 'ansible_skip_tags' from source: magic vars
30575 1726867564.60174: variable 'ansible_limit' from source: magic vars
30575 1726867564.60174: variable 'ansible_run_tags' from source: magic vars
30575 1726867564.60175: variable 'ansible_verbosity' from source: magic vars
30575 1726867564.60200: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml
30575 1726867564.60587: in VariableManager get_vars()
30575 1726867564.60598: done with get_vars()
30575 1726867564.60630: in VariableManager get_vars()
30575 1726867564.60639: done with get_vars()
30575 1726867564.60668: in VariableManager get_vars()
30575 1726867564.60675: done with get_vars()
30575 1726867564.60709: in VariableManager get_vars()
30575 1726867564.60717: done with get_vars()
30575 1726867564.60749: in VariableManager get_vars()
30575 1726867564.60757: done with get_vars()
30575 1726867564.60788: in VariableManager get_vars()
30575 1726867564.60796: done with get_vars()
30575 1726867564.60834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
30575 1726867564.60842: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
30575 1726867564.61001: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
30575 1726867564.61102: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
30575 1726867564.61104: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
30575 1726867564.61129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
30575 1726867564.61145: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
30575 1726867564.61242: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
30575 1726867564.61280: Loaded config def from plugin (callback/default)
30575 1726867564.61282: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
30575 1726867564.62070: Loaded config def from plugin (callback/junit)
30575 1726867564.62072: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
30575 1726867564.62103: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
30575 1726867564.62144: Loaded config def from plugin (callback/minimal)
30575 1726867564.62146: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
30575 1726867564.62171: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
30575 1726867564.62208: Loaded config def from plugin (callback/tree)
30575 1726867564.62210: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
30575 1726867564.62285: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
30575 1726867564.62287: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
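The deprecation warning at the top of this run names its own fix: replace the plural ANSIBLE_COLLECTIONS_PATHS environment variable with the singular ANSIBLE_COLLECTIONS_PATH before ansible-core 2.19 removes the old name. A minimal sketch of that change, assuming the collection location /tmp/collections-Isn reported in the version banner is the value being carried over:

```shell
# Drop the deprecated plural variable (slated for removal in
# ansible-core 2.19) and set the singular form in its place.
# The path is the collection location from this run's banner;
# substitute your own.
unset ANSIBLE_COLLECTIONS_PATHS
export ANSIBLE_COLLECTIONS_PATH=/tmp/collections-Isn
```

Alternatively, as the warning itself notes, deprecation warnings can be silenced wholesale with deprecation_warnings=False in ansible.cfg, though renaming the variable removes the root cause rather than hiding it.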
PLAYBOOK: tests_states_nm.yml **************************************************
2 plays in /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml
30575 1726867564.62308: in VariableManager get_vars()
30575 1726867564.62316: done with get_vars()
30575 1726867564.62320: in VariableManager get_vars()
30575 1726867564.62329: done with get_vars()
30575 1726867564.62332: variable 'omit' from source: magic vars
30575 1726867564.62355: in VariableManager get_vars()
30575 1726867564.62364: done with get_vars()
30575 1726867564.62378: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_states.yml' with nm as provider] ***********
30575 1726867564.63894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
30575 1726867564.63945: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
30575 1726867564.63974: getting the remaining hosts for this loop
30575 1726867564.63975: done getting the remaining hosts for this loop
30575 1726867564.63978: getting the next task for host managed_node3
30575 1726867564.63981: done getting next task for host managed_node3
30575 1726867564.63982: ^ task is: TASK: Gathering Facts
30575 1726867564.63983: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867564.63985: getting variables
30575 1726867564.63985: in VariableManager get_vars()
30575 1726867564.63993: Calling all_inventory to load vars for managed_node3
30575 1726867564.63994: Calling groups_inventory to load vars for managed_node3
30575 1726867564.63996: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867564.64004: Calling all_plugins_play to load vars for managed_node3
30575 1726867564.64010: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867564.64012: Calling groups_plugins_play to load vars for managed_node3
30575 1726867564.64033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867564.64064: done with get_vars()
30575 1726867564.64073: done getting variables
30575 1726867564.64117: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:6
Friday 20 September 2024 17:26:04 -0400 (0:00:00.018) 0:00:00.018 ******
30575 1726867564.64133: entering _queue_task() for managed_node3/gather_facts
30575 1726867564.64134: Creating lock for gather_facts
30575 1726867564.64400: worker is 1 (out of 1 available)
30575 1726867564.64412: exiting _queue_task() for managed_node3/gather_facts
30575 1726867564.64424: done queuing things up, now waiting for results queue to drain
30575 1726867564.64426: waiting for pending results...
30575 1726867564.64559: running TaskExecutor() for managed_node3/TASK: Gathering Facts
30575 1726867564.64610: in run() - task 0affcac9-a3a5-e081-a588-00000000001b
30575 1726867564.64621: variable 'ansible_search_path' from source: unknown
30575 1726867564.64651: calling self._execute()
30575 1726867564.64727: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867564.64734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867564.64741: variable 'omit' from source: magic vars
30575 1726867564.64809: variable 'omit' from source: magic vars
30575 1726867564.64831: variable 'omit' from source: magic vars
30575 1726867564.64855: variable 'omit' from source: magic vars
30575 1726867564.64889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30575 1726867564.64916: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30575 1726867564.64932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30575 1726867564.64946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867564.64956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867564.64979: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30575 1726867564.64982: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867564.64986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867564.65055: Set connection var ansible_pipelining to False
30575 1726867564.65058: Set connection var ansible_shell_type to sh
30575 1726867564.65063: Set connection var ansible_shell_executable to /bin/sh
30575 1726867564.65068: Set connection var ansible_timeout to 10
30575 1726867564.65073: Set connection var ansible_module_compression to ZIP_DEFLATED
30575 1726867564.65080: Set connection var ansible_connection to ssh
30575 1726867564.65097: variable 'ansible_shell_executable' from source: unknown
30575 1726867564.65100: variable 'ansible_connection' from source: unknown
30575 1726867564.65102: variable 'ansible_module_compression' from source: unknown
30575 1726867564.65105: variable 'ansible_shell_type' from source: unknown
30575 1726867564.65107: variable 'ansible_shell_executable' from source: unknown
30575 1726867564.65111: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867564.65113: variable 'ansible_pipelining' from source: unknown
30575 1726867564.65115: variable 'ansible_timeout' from source: unknown
30575 1726867564.65125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867564.65247: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False)
30575 1726867564.65254: variable 'omit' from source: magic vars
30575 1726867564.65259: starting attempt loop
30575 1726867564.65262: running the handler
30575 1726867564.65275: variable 'ansible_facts' from source: unknown
30575 1726867564.65290: _low_level_execute_command(): starting
30575 1726867564.65297: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30575 1726867564.65815: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30575 1726867564.65819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30575 1726867564.65822: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<<
30575 1726867564.65828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30575 1726867564.65857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
30575 1726867564.65870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30575 1726867564.65933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30575 1726867564.67633: stdout chunk (state=3): >>>/root <<<
30575 1726867564.67731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30575 1726867564.67761: stderr chunk (state=3): >>><<<
30575 1726867564.67765: stdout chunk (state=3): >>><<<
30575 1726867564.67784: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.15.68 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
30575 1726867564.67795: _low_level_execute_command(): starting
30575 1726867564.67800: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867564.6778486-30587-209955700405070 `" && echo ansible-tmp-1726867564.6778486-30587-209955700405070="` echo /root/.ansible/tmp/ansible-tmp-1726867564.6778486-30587-209955700405070 `" ) && sleep 0'
30575 1726867564.68231: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
30575 1726867564.68235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<<
30575 1726867564.68237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<<
30575 1726867564.68239: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<<
30575 1726867564.68248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30575 1726867564.68312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30575 1726867564.68352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30575 1726867564.70283: stdout chunk (state=3): >>>ansible-tmp-1726867564.6778486-30587-209955700405070=/root/.ansible/tmp/ansible-tmp-1726867564.6778486-30587-209955700405070 <<<
30575 1726867564.70367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30575 1726867564.70396: stderr chunk (state=3): >>><<<
30575 1726867564.70399: stdout chunk (state=3): >>><<<
30575 1726867564.70418: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867564.6778486-30587-209955700405070=/root/.ansible/tmp/ansible-tmp-1726867564.6778486-30587-209955700405070
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.15.68 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
30575 1726867564.70453: variable 'ansible_module_compression' from source: unknown
30575 1726867564.70496: ANSIBALLZ: Using generic lock for ansible.legacy.setup
30575 1726867564.70499: ANSIBALLZ: Acquiring lock
30575 1726867564.70502: ANSIBALLZ: Lock acquired: 140240646918832
30575 1726867564.70504: ANSIBALLZ: Creating module
30575 1726867564.92662: ANSIBALLZ: Writing module into payload
30575 1726867564.92757: ANSIBALLZ: Writing module
30575 1726867564.92773: ANSIBALLZ: Renaming module
30575 1726867564.92780: ANSIBALLZ: Done creating module
30575 1726867564.92813: variable 'ansible_facts' from source: unknown
30575 1726867564.92819: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30575 1726867564.92827: _low_level_execute_command(): starting
30575 1726867564.92833: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
30575 1726867564.93246: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30575 1726867564.93249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30575 1726867564.93252: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
30575 1726867564.93254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30575 1726867564.93256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30575 1726867564.93302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
30575 1726867564.93305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30575 1726867564.93311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30575 1726867564.93363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30575 1726867564.95055: stdout chunk (state=3): >>>PLATFORM <<<
30575 1726867564.95139: stdout chunk (state=3): >>>Linux <<<
30575 1726867564.95159: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<<
30575 1726867564.95167: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<<
30575 1726867564.95298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30575 1726867564.95317: stderr chunk (state=3): >>><<<
30575 1726867564.95320: stdout chunk (state=3): >>><<<
30575 1726867564.95333: _low_level_execute_command() done: rc=0, stdout=PLATFORM
Linux
FOUND
/usr/bin/python3.12
/usr/bin/python3
/usr/bin/python3
ENDFOUND
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68
debug2:
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867564.95342 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 30575 1726867564.95375: _low_level_execute_command(): starting 30575 1726867564.95381: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 30575 1726867564.95454: Sending initial data 30575 1726867564.95457: Sent initial data (1181 bytes) 30575 1726867564.95781: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867564.95784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867564.95787: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867564.95789: stderr chunk (state=3): >>>debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867564.95795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867564.95835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867564.95845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867564.95899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867564.99328: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 30575 1726867564.99704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867564.99786: stderr chunk (state=3): >>><<< 30575 1726867564.99790: stdout chunk (state=3): >>><<< 30575 1726867564.99792: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867564.99849: variable 'ansible_facts' from source: unknown 30575 1726867564.99852: variable 'ansible_facts' from source: unknown 30575 1726867564.99854: variable 'ansible_module_compression' from source: unknown 30575 1726867564.99969: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 30575 1726867564.99972: variable 'ansible_facts' from source: unknown 30575 1726867565.00092: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867564.6778486-30587-209955700405070/AnsiballZ_setup.py 30575 1726867565.00227: Sending initial data 30575 1726867565.00231: Sent initial data (154 bytes) 30575 1726867565.00779: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867565.00783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867565.00785: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867565.00787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867565.00839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867565.00846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867565.00848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867565.00895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867565.02470: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 
debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867565.02517: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867565.02557: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpclh7g9am /root/.ansible/tmp/ansible-tmp-1726867564.6778486-30587-209955700405070/AnsiballZ_setup.py <<< 30575 1726867565.02570: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867564.6778486-30587-209955700405070/AnsiballZ_setup.py" <<< 30575 1726867565.02612: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpclh7g9am" to remote "/root/.ansible/tmp/ansible-tmp-1726867564.6778486-30587-209955700405070/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867564.6778486-30587-209955700405070/AnsiballZ_setup.py" <<< 30575 1726867565.03736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867565.03768: stderr chunk (state=3): >>><<< 30575 1726867565.03771: stdout chunk (state=3): >>><<< 30575 1726867565.03789: done transferring module to remote 30575 1726867565.03799: _low_level_execute_command(): starting 30575 1726867565.03802: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867564.6778486-30587-209955700405070/ 
/root/.ansible/tmp/ansible-tmp-1726867564.6778486-30587-209955700405070/AnsiballZ_setup.py && sleep 0' 30575 1726867565.04205: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867565.04208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867565.04215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867565.04217: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867565.04219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867565.04264: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867565.04267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867565.04313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867565.06058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867565.06078: stderr chunk (state=3): >>><<< 30575 1726867565.06082: stdout chunk (state=3): >>><<< 30575 1726867565.06095: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867565.06098: _low_level_execute_command(): starting 30575 1726867565.06100: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867564.6778486-30587-209955700405070/AnsiballZ_setup.py && sleep 0' 30575 1726867565.06495: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867565.06498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867565.06501: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867565.06502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867565.06548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867565.06552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867565.06603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867565.08703: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 30575 1726867565.08738: stdout chunk (state=3): >>>import _imp # builtin <<< 30575 1726867565.08771: stdout chunk (state=3): >>>import '_thread' # <<< 30575 1726867565.08776: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 30575 1726867565.08843: stdout chunk (state=3): >>>import '_io' # <<< 30575 1726867565.08850: stdout chunk (state=3): >>>import 'marshal' # <<< 30575 1726867565.08886: stdout chunk (state=3): >>>import 'posix' # <<< 30575 1726867565.08916: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 30575 1726867565.08948: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 30575 1726867565.08953: stdout chunk (state=3): >>># installed zipimport hook <<< 30575 1726867565.09004: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 30575 1726867565.09009: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867565.09028: stdout chunk (state=3): >>>import '_codecs' # <<< 30575 1726867565.09050: stdout chunk (state=3): >>>import 'codecs' # <<< 30575 1726867565.09085: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 30575 1726867565.09109: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 30575 1726867565.09122: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec466184d0> <<< 30575 1726867565.09130: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec465e7b30> <<< 30575 1726867565.09152: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 30575 1726867565.09155: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 30575 1726867565.09169: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4661aa50> <<< 30575 1726867565.09203: stdout chunk (state=3): >>>import '_signal' # <<< 30575 1726867565.09282: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # <<< 30575 1726867565.09285: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 30575 1726867565.09368: stdout chunk (state=3): >>>import '_collections_abc' # <<< 30575 1726867565.09392: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 30575 1726867565.09421: stdout chunk (state=3): >>>import 'os' # <<< 30575 1726867565.09437: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 30575 1726867565.09466: stdout chunk (state=3): >>>Processing user 
site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 30575 1726867565.09503: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 30575 1726867565.09527: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 30575 1726867565.09550: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec463e9130> <<< 30575 1726867565.09602: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 30575 1726867565.09614: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec463e9fa0> <<< 30575 1726867565.09642: stdout chunk (state=3): >>>import 'site' # <<< 30575 1726867565.09669: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 30575 1726867565.10087: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 30575 1726867565.10126: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 30575 1726867565.10152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 30575 1726867565.10162: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 30575 1726867565.10225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 30575 1726867565.10228: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec46427da0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 30575 1726867565.10262: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 30575 1726867565.10268: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec46427fb0> <<< 30575 1726867565.10297: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 30575 1726867565.10307: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 30575 1726867565.10338: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/collections/__init__.py <<< 30575 1726867565.10381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867565.10399: stdout chunk (state=3): >>>import 'itertools' # <<< 30575 1726867565.10434: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 30575 1726867565.10471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4645f770> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 30575 1726867565.10474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4645fe00> <<< 30575 1726867565.10492: stdout chunk (state=3): >>>import '_collections' # <<< 30575 1726867565.10546: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4643fa40> <<< 30575 1726867565.10549: stdout chunk (state=3): >>>import '_functools' # <<< 30575 1726867565.10579: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4643d160> <<< 30575 1726867565.10670: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec46424f50> <<< 30575 1726867565.10708: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 30575 1726867565.10719: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 30575 1726867565.10730: stdout chunk (state=3): >>>import '_sre' # <<< 30575 1726867565.10750: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 30575 1726867565.10782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 30575 1726867565.10803: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 30575 1726867565.10839: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4647f6b0> <<< 30575 1726867565.10855: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4647e2d0> <<< 30575 1726867565.10895: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 30575 1726867565.10906: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4643e030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4647cb60> <<< 30575 1726867565.10955: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 30575 1726867565.10964: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464b46b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464241d0> <<< 30575 1726867565.10990: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 30575 
1726867565.11031: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec464b4b60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464b4a10> <<< 30575 1726867565.11075: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec464b4dd0> <<< 30575 1726867565.11087: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec46422cf0> <<< 30575 1726867565.11133: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 30575 1726867565.11135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867565.11182: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 30575 1726867565.11185: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 30575 1726867565.11195: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464b54c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464b5190> import 'importlib.machinery' # <<< 30575 1726867565.11235: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 30575 1726867565.11268: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464b63c0> import 'importlib.util' # <<< 30575 1726867565.11271: stdout chunk (state=3): >>>import 'runpy' # <<< 30575 1726867565.11293: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 30575 1726867565.11346: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 30575 1726867565.11369: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464d05c0> <<< 30575 1726867565.11378: stdout chunk (state=3): >>>import 'errno' # <<< 30575 1726867565.11409: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867565.11440: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec464d1d00> <<< 30575 1726867565.11476: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 30575 1726867565.11480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 30575 1726867565.11493: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches 
/usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464d2ba0> <<< 30575 1726867565.11535: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867565.11544: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec464d3200> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464d20f0> <<< 30575 1726867565.11580: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 30575 1726867565.11593: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 30575 1726867565.11624: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec464d3c80> <<< 30575 1726867565.11640: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464d33b0> <<< 30575 1726867565.11675: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464b6330> <<< 30575 1726867565.11701: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 30575 1726867565.11723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 30575 1726867565.11744: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 30575 1726867565.11763: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 30575 1726867565.11800: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec461d7bf0> <<< 30575 1726867565.11823: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 30575 1726867565.11854: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec462006b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec46200440> <<< 30575 1726867565.11886: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec462006e0> <<< 30575 1726867565.11924: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 30575 1726867565.12002: stdout chunk (state=3): 
>>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867565.12185: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec46201010> <<< 30575 1726867565.12383: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec462019d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec462008c0> <<< 30575 1726867565.12389: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec461d5d90> <<< 30575 1726867565.12391: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 30575 1726867565.12394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 30575 1726867565.12396: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 30575 1726867565.12496: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec46202d80> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec46201880> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464b6ae0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # 
code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 30575 1726867565.12521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 30575 1726867565.12548: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4622f0b0> <<< 30575 1726867565.12604: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 30575 1726867565.12624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867565.12638: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 30575 1726867565.12658: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 30575 1726867565.12700: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4624f440> <<< 30575 1726867565.12727: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 30575 1726867565.12768: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 30575 1726867565.12821: stdout chunk (state=3): >>>import 'ntpath' # <<< 30575 1726867565.12858: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec462b0200> <<< 30575 1726867565.12894: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 30575 1726867565.12898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 30575 1726867565.12921: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 30575 1726867565.12963: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 30575 1726867565.13045: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec462b2960> <<< 30575 1726867565.13121: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec462b0320> <<< 30575 1726867565.13154: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4627d220> <<< 30575 1726867565.13202: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45b25370> <<< 30575 1726867565.13214: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4624e240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec46203cb0> <<< 30575 1726867565.13387: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 30575 1726867565.13412: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fec4624e5a0> <<< 30575 1726867565.13665: stdout chunk (state=3): >>># zipimport: found 103 names in 
'/tmp/ansible_ansible.legacy.setup_payload_52sm2y83/ansible_ansible.legacy.setup_payload.zip' <<< 30575 1726867565.13679: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.13787: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.13816: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 30575 1726867565.13828: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 30575 1726867565.13868: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 30575 1726867565.13943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 30575 1726867565.13984: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45b8b110> <<< 30575 1726867565.13994: stdout chunk (state=3): >>>import '_typing' # <<< 30575 1726867565.14170: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45b6a000> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45b69160> <<< 30575 1726867565.14202: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.14219: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 30575 1726867565.14242: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.14268: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 30575 1726867565.15662: stdout chunk (state=3): >>># zipimport: zlib available <<< 
30575 1726867565.16793: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45b88fe0> <<< 30575 1726867565.16837: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867565.16873: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 30575 1726867565.16924: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 30575 1726867565.16928: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45bbab40> <<< 30575 1726867565.16951: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45bba8d0> <<< 30575 1726867565.16999: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45bba1e0> <<< 30575 1726867565.17046: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 30575 1726867565.17078: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45bba930> <<< 30575 1726867565.17082: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45b8bda0> import 'atexit' # <<< 30575 1726867565.17114: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45bbb890> <<< 30575 1726867565.17121: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45bbba40> <<< 30575 1726867565.17149: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 30575 1726867565.17197: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 30575 1726867565.17209: stdout chunk (state=3): >>>import '_locale' # <<< 30575 1726867565.17249: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45bbbf50> <<< 30575 1726867565.17263: stdout chunk (state=3): >>>import 'pwd' # <<< 30575 1726867565.17274: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 30575 1726867565.17303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 30575 
1726867565.17341: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a25ca0> <<< 30575 1726867565.17376: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45a278c0> <<< 30575 1726867565.17395: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 30575 1726867565.17416: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 30575 1726867565.17448: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a282c0> <<< 30575 1726867565.17470: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 30575 1726867565.17495: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 30575 1726867565.17516: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a29460> <<< 30575 1726867565.17534: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 30575 1726867565.17570: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 30575 1726867565.17597: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 30575 1726867565.17601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 30575 1726867565.17647: 
stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a2bf50> <<< 30575 1726867565.17685: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867565.17694: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec462002f0> <<< 30575 1726867565.17710: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a2a210> <<< 30575 1726867565.17727: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 30575 1726867565.17759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 30575 1726867565.17772: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 30575 1726867565.17789: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 30575 1726867565.17798: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 30575 1726867565.17897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 30575 1726867565.17906: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 30575 1726867565.17925: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 30575 1726867565.17932: stdout chunk (state=3): >>>import 'token' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec45a33d40> <<< 30575 1726867565.17945: stdout chunk (state=3): >>>import '_tokenize' # <<< 30575 1726867565.18007: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a32810> <<< 30575 1726867565.18016: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a32570> <<< 30575 1726867565.18029: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 30575 1726867565.18044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 30575 1726867565.18112: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a32ae0> <<< 30575 1726867565.18144: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a2b440> <<< 30575 1726867565.18166: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867565.18190: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45a77fe0> <<< 30575 1726867565.18205: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 30575 1726867565.18207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a781a0> <<< 30575 1726867565.18230: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 30575 1726867565.18242: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 30575 1726867565.18268: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 30575 1726867565.18270: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 30575 1726867565.18300: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867565.18306: stdout chunk (state=3): >>>import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45a79c10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a799d0> <<< 30575 1726867565.18325: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 30575 1726867565.18353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 30575 1726867565.18403: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867565.18406: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45a7c170> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a7a2d0> <<< 30575 1726867565.18434: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 30575 1726867565.18466: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867565.18498: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 30575 1726867565.18508: stdout chunk (state=3): >>>import '_string' # <<< 30575 1726867565.18551: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a7f8c0> <<< 30575 1726867565.18672: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a7c290> <<< 30575 1726867565.18738: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867565.18740: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45a80650> <<< 30575 1726867565.18781: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867565.18790: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45a808f0> <<< 30575 1726867565.18850: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45a80bc0> <<< 30575 1726867565.18855: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a78350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 30575 1726867565.18902: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 30575 1726867565.18912: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 30575 1726867565.18953: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867565.18956: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4590c1a0> <<< 30575 1726867565.19112: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4590d5b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a82930> <<< 30575 1726867565.19144: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded 
from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45a83ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a825d0> <<< 30575 1726867565.19169: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30575 1726867565.19200: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 30575 1726867565.19285: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.19376: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.19390: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 30575 1726867565.19431: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 30575 1726867565.19455: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.19554: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.19666: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.20187: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.20709: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 30575 1726867565.20719: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 30575 1726867565.20734: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 30575 1726867565.20757: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 30575 1726867565.20768: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867565.20819: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec459117c0> <<< 30575 1726867565.20898: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 30575 1726867565.20924: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec459125a0> <<< 30575 1726867565.20928: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4590d6d0> <<< 30575 1726867565.20979: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 30575 1726867565.20988: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.21017: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.21020: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 30575 1726867565.21042: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.21185: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.21336: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 30575 1726867565.21354: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45912510> <<< 30575 1726867565.21361: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 30575 1726867565.21827: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.22254: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.22329: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.22430: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 30575 1726867565.22483: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30575 1726867565.22511: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 30575 1726867565.22558: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.22649: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 30575 1726867565.22655: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.22684: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 30575 1726867565.22692: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.22728: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.22767: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 30575 1726867565.22780: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.22997: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.23228: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 30575 1726867565.23282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 30575 1726867565.23302: stdout chunk (state=3): >>>import '_ast' # <<< 30575 1726867565.23357: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec459135c0> <<< 30575 1726867565.23373: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 30575 1726867565.23444: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.23526: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 30575 1726867565.23537: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 30575 1726867565.23562: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.23601: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.23646: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 30575 1726867565.23651: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.23702: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.23744: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.23803: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.23867: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 30575 1726867565.23907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867565.23987: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867565.23993: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4591e030> <<< 30575 1726867565.24028: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fec4591b080> <<< 30575 1726867565.24070: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 30575 1726867565.24079: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 30575 1726867565.24082: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.24146: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.24208: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.24241: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.24285: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867565.24327: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 30575 1726867565.24330: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 30575 1726867565.24347: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 30575 1726867565.24409: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 30575 1726867565.24429: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 30575 1726867565.24442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 30575 1726867565.24498: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a06840> <<< 30575 1726867565.24539: stdout chunk (state=3): >>>import 'argparse' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec45afe510> <<< 30575 1726867565.24625: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4591ddc0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a82120> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 30575 1726867565.24658: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.24661: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.24691: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 30575 1726867565.24759: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 30575 1726867565.24775: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.24784: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 30575 1726867565.24790: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.24856: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.24920: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.24972: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30575 1726867565.25001: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.25046: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.25087: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.25134: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 30575 1726867565.25137: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.25210: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.25275: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.25302: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 30575 1726867565.25341: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 30575 1726867565.25344: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.25524: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.25697: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.25744: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.25795: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867565.25818: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 30575 1726867565.25836: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 30575 1726867565.25856: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 30575 1726867565.25883: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 30575 1726867565.25904: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec459b21e0> <<< 30575 1726867565.25925: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 30575 1726867565.25933: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 30575 1726867565.25956: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 30575 1726867565.25997: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 30575 1726867565.26021: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 30575 1726867565.26034: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 30575 1726867565.26048: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45603f20> <<< 30575 1726867565.26087: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867565.26090: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45608320> <<< 30575 1726867565.26153: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec459a2e10> <<< 30575 1726867565.26156: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec459b2d50> <<< 30575 1726867565.26204: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec459b0950> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec459b04a0> <<< 30575 1726867565.26229: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 30575 1726867565.26292: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 30575 1726867565.26320: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 30575 1726867565.26326: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 30575 1726867565.26351: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 30575 1726867565.26356: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 30575 1726867565.26390: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4560b2f0> <<< 30575 1726867565.26395: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4560aba0> <<< 30575 1726867565.26424: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867565.26441: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4560ad80> <<< 30575 1726867565.26454: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45609fd0> <<< 30575 1726867565.26473: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 30575 1726867565.26579: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 30575 1726867565.26584: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4560b3b0> <<< 30575 1726867565.26609: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 30575 1726867565.26638: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 30575 1726867565.26673: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4566dee0> <<< 30575 1726867565.26705: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4560bec0> <<< 30575 1726867565.26739: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec459b0620> <<< 30575 1726867565.26742: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 30575 1726867565.26756: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 30575 1726867565.26775: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.26787: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 30575 1726867565.26805: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.26861: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.26914: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.other.facter' # <<< 30575 1726867565.26937: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.26985: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.27047: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 30575 1726867565.27051: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.27074: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 30575 1726867565.27080: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.27118: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.27150: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 30575 1726867565.27153: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.27205: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.27257: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 30575 1726867565.27262: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.27317: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.27362: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 30575 1726867565.27375: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.27432: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.27654: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30575 1726867565.27687: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 30575 1726867565.27699: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.28099: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.28545: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 30575 1726867565.28591: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.28651: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.28674: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.28708: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 30575 1726867565.28732: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.28764: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.28797: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 30575 1726867565.28801: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.28859: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.28912: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 30575 1726867565.28928: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.28961: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.28993: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 30575 1726867565.29000: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.29031: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.29061: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 30575 1726867565.29070: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.29153: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.29242: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 30575 1726867565.29248: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 30575 1726867565.29266: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4566ffe0> <<< 30575 1726867565.29294: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 30575 1726867565.29316: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 30575 1726867565.29430: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4566eab0> import 'ansible.module_utils.facts.system.local' # <<< 30575 1726867565.29450: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.29515: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.29581: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 30575 1726867565.29585: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.29680: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.29770: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 30575 1726867565.29780: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.29842: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.29916: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 30575 1726867565.29922: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.29967: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.30013: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 30575 1726867565.30062: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 30575 1726867565.30131: 
stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867565.30192: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec456a61e0> <<< 30575 1726867565.30372: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45695fd0> <<< 30575 1726867565.30384: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 30575 1726867565.30391: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.30446: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.30503: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 30575 1726867565.30512: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.30591: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.30672: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.30786: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.30929: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 30575 1726867565.30945: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.30983: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.31026: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 30575 1726867565.31034: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.31074: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.31123: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches 
/usr/lib64/python3.12/getpass.py <<< 30575 1726867565.31131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 30575 1726867565.31153: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867565.31176: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec456bdac0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45697200> <<< 30575 1726867565.31190: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # <<< 30575 1726867565.31199: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.31214: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 30575 1726867565.31239: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.31273: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.31316: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 30575 1726867565.31322: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.31481: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.31628: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 30575 1726867565.31642: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.31740: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.31842: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.31885: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.31924: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 
30575 1726867565.31929: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 30575 1726867565.31948: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.31964: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.31992: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.32126: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.32265: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 30575 1726867565.32281: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.32401: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.32524: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 30575 1726867565.32536: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.32569: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.32603: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.33168: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.33675: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 30575 1726867565.33680: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 30575 1726867565.33689: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.33799: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.33902: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 30575 1726867565.33911: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.34008: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.34110: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 30575 
1726867565.34114: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.34272: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.34424: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 30575 1726867565.34449: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.34465: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 30575 1726867565.34475: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.34521: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.34557: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 30575 1726867565.34582: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.34667: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.34767: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.34964: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.35172: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 30575 1726867565.35186: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.35225: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.35262: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 30575 1726867565.35267: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.35297: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.35327: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 30575 1726867565.35338: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.35402: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.35472: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 30575 1726867565.35479: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.35507: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.35525: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 30575 1726867565.35544: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.35598: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.35659: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 30575 1726867565.35665: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.35729: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.35789: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 30575 1726867565.35796: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.36054: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.36316: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 30575 1726867565.36321: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.36383: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.36441: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 30575 1726867565.36455: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.36490: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.36532: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 30575 1726867565.36537: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.36574: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.36607: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 30575 
1726867565.36621: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.36650: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.36686: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 30575 1726867565.36692: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.36773: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.36848: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 30575 1726867565.36880: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.36883: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 30575 1726867565.36900: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.36950: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.36988: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 30575 1726867565.37003: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.37024: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.37046: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.37093: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.37143: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.37213: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.37282: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 30575 1726867565.37298: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # <<< 30575 1726867565.37309: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 30575 1726867565.37363: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.37416: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 30575 1726867565.37425: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.37610: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.37804: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 30575 1726867565.37809: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.37858: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.37897: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 30575 1726867565.37917: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.37960: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.38011: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 30575 1726867565.38017: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.38100: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.38191: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 30575 1726867565.38201: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 30575 1726867565.38206: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.38296: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.38385: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 30575 1726867565.38390: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 30575 1726867565.38461: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867565.38652: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 30575 1726867565.38658: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 30575 1726867565.38684: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 30575 1726867565.38694: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 30575 1726867565.38730: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45453110> <<< 30575 1726867565.38743: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45450cb0> <<< 30575 1726867565.38792: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45450e90> <<< 30575 1726867565.53764: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 30575 1726867565.53769: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45499010> <<< 30575 1726867565.53809: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 30575 1726867565.53841: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45499df0> <<< 30575 
1726867565.53885: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 30575 1726867565.53906: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867565.53948: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 30575 1726867565.53983: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec454e0530> <<< 30575 1726867565.53996: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec454e0050> <<< 30575 1726867565.54207: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 30575 1726867565.74557: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "26", "second": "05", "epoch": "1726867565", "epoch_int": "1726867565", "date": "2024-09-20", "time": "17:26:05", "iso8601_micro": "2024-09-20T21:26:05.384433Z", "iso8601": "2024-09-20T21:26:05Z", "iso8601_basic": "20240920T172605384433", "iso8601_basic_short": "20240920T172605", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, 
"version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-68", "ansible_nodename": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24e9df8b51e91cc3587e46253f155b", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7uUwLUrAgQyz7a8YAgUBvVYqUHIXrD9OD4IdRIvS0wM5DjkrqZknN+lxTMZpMWg/jlXFJVvFXYt0TDIxUv3VMQ7CG9CyYmBbeuJEWvoYwb8DuKGoQjvaw9hPa0a/tdKQRUk5Ee48tJDb1/f7b8HC6W47zMa508De0NKmJpkUCxPSiwETfkKtSFi1NU3yedKOlKSYO4jtNZMDSixlZgDT5la3jcB1k7FimMu61ZL4YdRdqowrsERzsKoyoubw2+euaXWxsKU9sxogT2uxy65PoA58KxP/BEqzQxzR9t9sEvGNVBRBcuBPyFKAEMwdm8wwEuHftGIX6HVD1ZyJ1kV94Sw1QBrBKVYLOc0F2Vfxah2KpheJtfxHN+3Y3VDCJCkheMOUfJL9Uq80f2+8xs3fb05mdaTabyPG6tsrK36M4NCUEwR/rlJ3z1xlUO5AQ7JnNr6OrRQTCXiZXYW8yubiTXlPYBD02/Zw1skEHGR9bVLEEd//GNW0z8DiLO9vRib8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFKa0veb+8P6VFgxqYEtIVaL2y6+Ja4kI5pG6tlGueD6mqrC1AYcokgYEcDSMDOhGEqO5Njf6G9zjcAWiPgmZds=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIE2riHWdppRksv40oyHGkAt2lseuRiuwNlSobn5rl+/f", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 49840 10.31.15.68 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 49840 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dist<<< 30575 1726867565.74594: stdout chunk (state=3): >>>ribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 0.8916015625, "5m": 0.640625, "15m": 
0.35888671875}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", 
"rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:de:45:ad:8b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.68", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:deff:fe45:ad8b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", 
"tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.68", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:de:45:ad:8b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.68"], "ansible_all_ipv6_addresses": ["fe80::8ff:deff:fe45:ad8b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.68", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:deff:fe45:ad8b"]}, "ansible_fips": false, "ansible_lsb": {}, 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2979, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 552, "free": 2979}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24e9df-8b51-e91c-c358-7e46253f155b", "ansible_product_uuid": "ec24e9df-8b51-e91c-c358-7e46253f155b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": 
[]}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 802, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261800341504, "block_size": 4096, "block_total": 65519099, "block_available": 63916099, "block_used": 1603000, "inode_total": 131070960, "inode_available": 131029134, "inode_used": 41826, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 30575 1726867565.75169: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv <<< 30575 1726867565.75174: stdout chunk (state=3): >>># clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # 
clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib <<< 30575 1726867565.75284: stdout chunk (state=3): >>># cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ <<< 30575 1726867565.75291: stdout chunk (state=3): >>># cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator <<< 30575 1726867565.75302: stdout chunk (state=3): >>># cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler <<< 30575 1726867565.75378: stdout chunk (state=3): >>># cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing 
json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket <<< 30575 1726867565.75387: stdout chunk (state=3): >>># cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # 
cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] 
removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local <<< 30575 1726867565.75393: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios <<< 30575 1726867565.75396: stdout chunk (state=3): >>># cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd <<< 30575 1726867565.75417: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn <<< 30575 1726867565.75504: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat 
# cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy 
ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 30575 1726867565.75801: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 30575 1726867565.75804: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 30575 1726867565.75835: stdout chunk (state=3): >>># destroy _bz2 # 
destroy _compression # destroy _lzma # destroy _blake2 <<< 30575 1726867565.75914: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath <<< 30575 1726867565.76099: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool <<< 30575 1726867565.76155: stdout chunk (state=3): >>># destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl <<< 30575 1726867565.76187: stdout chunk (state=3): >>># destroy datetime <<< 30575 1726867565.76226: stdout chunk (state=3): >>># destroy subprocess # destroy base64 # destroy _ssl <<< 30575 1726867565.76338: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct <<< 30575 1726867565.76342: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # 
destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 30575 1726867565.76564: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping 
_operator # cleanup[3] wiping types <<< 30575 1726867565.76568: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat <<< 30575 1726867565.76571: stdout chunk (state=3): >>># destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc <<< 30575 1726867565.76573: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 30575 1726867565.76575: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread <<< 30575 1726867565.76582: stdout chunk (state=3): >>># cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 30575 1726867565.76584: stdout chunk (state=3): >>># cleanup[3] wiping builtins <<< 30575 1726867565.76869: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # 
destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 30575 1726867565.77087: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 30575 1726867565.77482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867565.77492: stdout chunk (state=3): >>><<< 30575 1726867565.77508: stderr chunk (state=3): >>><<< 30575 1726867565.77683: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec466184d0> import 'encodings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec465e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4661aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec463e9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec463e9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec46427da0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec46427fb0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4645f770> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec4645fe00> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4643fa40> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4643d160> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec46424f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4647f6b0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4647e2d0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4643e030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4647cb60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464b46b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464241d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec464b4b60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464b4a10> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec464b4dd0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec46422cf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464b54c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464b5190> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464b63c0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464d05c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec464d1d00> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464d2ba0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec464d3200> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464d20f0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fec464d3c80> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464d33b0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464b6330> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec461d7bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec462006b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec46200440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec462006e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec46201010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec462019d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec462008c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec461d5d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec46202d80> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec46201880> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec464b6ae0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec4622f0b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4624f440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec462b0200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec462b2960> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec462b0320> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4627d220> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec45b25370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4624e240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec46203cb0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fec4624e5a0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_52sm2y83/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45b8b110> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45b6a000> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45b69160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45b88fe0> # 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45bbab40> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45bba8d0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45bba1e0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45bba930> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45b8bda0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45bbb890> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45bbba40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45bbbf50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a25ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45a278c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a282c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a29460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec45a2bf50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec462002f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a2a210> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a33d40> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a32810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a32570> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a32ae0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a2b440> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45a77fe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a781a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45a79c10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a799d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45a7c170> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec45a7a2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a7f8c0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a7c290> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45a80650> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45a808f0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45a80bc0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a78350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4590c1a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4590d5b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a82930> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45a83ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a825d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec459117c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec459125a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4590d6d0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45912510> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec459135c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4591e030> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4591b080> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a06840> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45afe510> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4591ddc0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45a82120> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec459b21e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45603f20> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45608320> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec459a2e10> import 'multiprocessing.reduction' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec459b2d50> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec459b0950> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec459b04a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4560b2f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4560aba0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4560ad80> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45609fd0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4560b3b0> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec4566dee0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4560bec0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec459b0620> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4566ffe0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec4566eab0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec456a61e0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45695fd0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec456bdac0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45697200> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec45453110> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45450cb0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45450e90> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45499010> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec45499df0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec454e0530> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec454e0050> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "26", "second": "05", "epoch": "1726867565", "epoch_int": "1726867565", "date": "2024-09-20", "time": "17:26:05", "iso8601_micro": "2024-09-20T21:26:05.384433Z", "iso8601": "2024-09-20T21:26:05Z", "iso8601_basic": "20240920T172605384433", "iso8601_basic_short": "20240920T172605", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": 
"ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-68", "ansible_nodename": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24e9df8b51e91cc3587e46253f155b", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7uUwLUrAgQyz7a8YAgUBvVYqUHIXrD9OD4IdRIvS0wM5DjkrqZknN+lxTMZpMWg/jlXFJVvFXYt0TDIxUv3VMQ7CG9CyYmBbeuJEWvoYwb8DuKGoQjvaw9hPa0a/tdKQRUk5Ee48tJDb1/f7b8HC6W47zMa508De0NKmJpkUCxPSiwETfkKtSFi1NU3yedKOlKSYO4jtNZMDSixlZgDT5la3jcB1k7FimMu61ZL4YdRdqowrsERzsKoyoubw2+euaXWxsKU9sxogT2uxy65PoA58KxP/BEqzQxzR9t9sEvGNVBRBcuBPyFKAEMwdm8wwEuHftGIX6HVD1ZyJ1kV94Sw1QBrBKVYLOc0F2Vfxah2KpheJtfxHN+3Y3VDCJCkheMOUfJL9Uq80f2+8xs3fb05mdaTabyPG6tsrK36M4NCUEwR/rlJ3z1xlUO5AQ7JnNr6OrRQTCXiZXYW8yubiTXlPYBD02/Zw1skEHGR9bVLEEd//GNW0z8DiLO9vRib8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFKa0veb+8P6VFgxqYEtIVaL2y6+Ja4kI5pG6tlGueD6mqrC1AYcokgYEcDSMDOhGEqO5Njf6G9zjcAWiPgmZds=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIE2riHWdppRksv40oyHGkAt2lseuRiuwNlSobn5rl+/f", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 49840 10.31.15.68 22", "XDG_SESSION_CLASS": 
"user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 49840 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 0.8916015625, "5m": 0.640625, "15m": 0.35888671875}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": 
"on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:de:45:ad:8b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": 
{"address": "10.31.15.68", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:deff:fe45:ad8b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", 
"esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.68", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:de:45:ad:8b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.68"], "ansible_all_ipv6_addresses": ["fe80::8ff:deff:fe45:ad8b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.68", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:deff:fe45:ad8b"]}, "ansible_fips": false, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2979, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 552, "free": 2979}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", 
"ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24e9df-8b51-e91c-c358-7e46253f155b", "ansible_product_uuid": "ec24e9df-8b51-e91c-c358-7e46253f155b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 802, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261800341504, "block_size": 4096, "block_total": 65519099, "block_available": 63916099, "block_used": 1603000, "inode_total": 131070960, "inode_available": 131029134, "inode_used": 
41826, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing 
_collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing 
tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] 
removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing 
ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # 
cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing 
ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # 
destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] 
removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata 
# destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # 
cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy 
_bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks [... remainder of warning omitted: verbatim repeat of the interpreter cleanup trace shown above ...]
removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # 
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy 
_datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
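(Editor's note on the interpreter-discovery warning above: one common way to avoid it is to pin the interpreter explicitly in inventory so discovery is skipped. The sketch below is illustrative, not part of this run's actual inventory — the host alias, address, and interpreter path are taken from the surrounding log but should be replaced with your own values.)

```yaml
# inventory.yml — pin the Python interpreter per host so ansible-core
# does not rely on runtime interpreter discovery (which emits the warning).
all:
  hosts:
    managed_node3:
      ansible_host: 10.31.15.68            # address seen in this log; illustrative
      ansible_python_interpreter: /usr/bin/python3.12
```

Alternatively, setting `interpreter_python = auto_silent` under `[defaults]` in `ansible.cfg` keeps discovery but suppresses the warning.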
30575 1726867565.79120: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867564.6778486-30587-209955700405070/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867565.79126: _low_level_execute_command(): starting 30575 1726867565.79129: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867564.6778486-30587-209955700405070/ > /dev/null 2>&1 && sleep 0' 30575 1726867565.79592: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867565.79792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867565.79849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867565.79853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867565.79855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867565.79857: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867565.79860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867565.79863: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867565.79867: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.15.68 is address <<< 30575 1726867565.79940: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867565.80195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867565.80404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867565.82283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867565.82286: stdout chunk (state=3): >>><<< 30575 1726867565.82289: stderr chunk (state=3): >>><<< 30575 1726867565.82291: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867565.82297: handler run complete 30575 1726867565.82473: variable 'ansible_facts' from source: unknown 30575 1726867565.82633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867565.83351: variable 'ansible_facts' from source: unknown 30575 1726867565.83546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867565.83857: attempt loop complete, returning result 30575 1726867565.83860: _execute() done 30575 1726867565.83863: dumping result to json 30575 1726867565.83926: done dumping result, returning 30575 1726867565.84101: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affcac9-a3a5-e081-a588-00000000001b] 30575 1726867565.84104: sending task result for task 0affcac9-a3a5-e081-a588-00000000001b ok: [managed_node3] 30575 1726867565.84856: no more pending results, returning what we have 30575 1726867565.84860: results queue empty 30575 1726867565.84861: checking for any_errors_fatal 30575 1726867565.84862: done checking for any_errors_fatal 30575 1726867565.84863: checking for max_fail_percentage 30575 1726867565.84865: done checking for max_fail_percentage 30575 1726867565.84865: checking to see if all hosts have failed and the running result is not ok 30575 1726867565.84866: done checking to see if all hosts have failed 30575 1726867565.84867: getting the remaining hosts for this loop 30575 1726867565.84869: done getting the remaining hosts for this loop 30575 1726867565.84873: getting the next task for host managed_node3 30575 1726867565.85039: done getting next task for host managed_node3 30575 1726867565.85041: ^ task is: TASK: meta 
(flush_handlers) 30575 1726867565.85043: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867565.85047: getting variables 30575 1726867565.85049: in VariableManager get_vars() 30575 1726867565.85076: Calling all_inventory to load vars for managed_node3 30575 1726867565.85082: Calling groups_inventory to load vars for managed_node3 30575 1726867565.85085: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867565.85095: Calling all_plugins_play to load vars for managed_node3 30575 1726867565.85132: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867565.85138: Calling groups_plugins_play to load vars for managed_node3 30575 1726867565.85341: done sending task result for task 0affcac9-a3a5-e081-a588-00000000001b 30575 1726867565.85345: WORKER PROCESS EXITING 30575 1726867565.85357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867565.85551: done with get_vars() 30575 1726867565.85561: done getting variables 30575 1726867565.85617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 30575 1726867565.85679: in VariableManager get_vars() 30575 1726867565.85689: Calling all_inventory to load vars for managed_node3 30575 1726867565.85691: Calling groups_inventory to load vars for managed_node3 30575 1726867565.85693: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867565.85697: Calling all_plugins_play to load vars for managed_node3 30575 1726867565.85699: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867565.85702: Calling groups_plugins_play to load vars for managed_node3 30575 1726867565.85835: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867565.85991: done with get_vars() 30575 1726867565.86006: done queuing things up, now waiting for results queue to drain 30575 1726867565.86008: results queue empty 30575 1726867565.86009: checking for any_errors_fatal 30575 1726867565.86011: done checking for any_errors_fatal 30575 1726867565.86011: checking for max_fail_percentage 30575 1726867565.86012: done checking for max_fail_percentage 30575 1726867565.86013: checking to see if all hosts have failed and the running result is not ok 30575 1726867565.86013: done checking to see if all hosts have failed 30575 1726867565.86014: getting the remaining hosts for this loop 30575 1726867565.86015: done getting the remaining hosts for this loop 30575 1726867565.86017: getting the next task for host managed_node3 30575 1726867565.86021: done getting next task for host managed_node3 30575 1726867565.86025: ^ task is: TASK: Include the task 'el_repo_setup.yml' 30575 1726867565.86026: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867565.86028: getting variables 30575 1726867565.86029: in VariableManager get_vars() 30575 1726867565.86036: Calling all_inventory to load vars for managed_node3 30575 1726867565.86038: Calling groups_inventory to load vars for managed_node3 30575 1726867565.86040: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867565.86043: Calling all_plugins_play to load vars for managed_node3 30575 1726867565.86045: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867565.86047: Calling groups_plugins_play to load vars for managed_node3 30575 1726867565.86203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867565.86384: done with get_vars() 30575 1726867565.86393: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:11 Friday 20 September 2024 17:26:05 -0400 (0:00:01.223) 0:00:01.242 ****** 30575 1726867565.86471: entering _queue_task() for managed_node3/include_tasks 30575 1726867565.86473: Creating lock for include_tasks 30575 1726867565.86767: worker is 1 (out of 1 available) 30575 1726867565.86883: exiting _queue_task() for managed_node3/include_tasks 30575 1726867565.86895: done queuing things up, now waiting for results queue to drain 30575 1726867565.86897: waiting for pending results... 
30575 1726867565.87093: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 30575 1726867565.87149: in run() - task 0affcac9-a3a5-e081-a588-000000000006 30575 1726867565.87168: variable 'ansible_search_path' from source: unknown 30575 1726867565.87209: calling self._execute() 30575 1726867565.87588: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867565.87593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867565.87596: variable 'omit' from source: magic vars 30575 1726867565.87784: _execute() done 30575 1726867565.87787: dumping result to json 30575 1726867565.87790: done dumping result, returning 30575 1726867565.87792: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [0affcac9-a3a5-e081-a588-000000000006] 30575 1726867565.87794: sending task result for task 0affcac9-a3a5-e081-a588-000000000006 30575 1726867565.87870: done sending task result for task 0affcac9-a3a5-e081-a588-000000000006 30575 1726867565.87873: WORKER PROCESS EXITING 30575 1726867565.87914: no more pending results, returning what we have 30575 1726867565.87919: in VariableManager get_vars() 30575 1726867565.87955: Calling all_inventory to load vars for managed_node3 30575 1726867565.87958: Calling groups_inventory to load vars for managed_node3 30575 1726867565.87962: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867565.87974: Calling all_plugins_play to load vars for managed_node3 30575 1726867565.87981: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867565.87985: Calling groups_plugins_play to load vars for managed_node3 30575 1726867565.88457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867565.88851: done with get_vars() 30575 1726867565.88858: variable 'ansible_search_path' from source: unknown 30575 1726867565.88873: we have 
included files to process 30575 1726867565.88874: generating all_blocks data 30575 1726867565.88876: done generating all_blocks data 30575 1726867565.88878: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 30575 1726867565.88880: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 30575 1726867565.88883: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 30575 1726867565.89756: in VariableManager get_vars() 30575 1726867565.89772: done with get_vars() 30575 1726867565.89786: done processing included file 30575 1726867565.89788: iterating over new_blocks loaded from include file 30575 1726867565.89790: in VariableManager get_vars() 30575 1726867565.89800: done with get_vars() 30575 1726867565.89801: filtering new block on tags 30575 1726867565.89816: done filtering new block on tags 30575 1726867565.89819: in VariableManager get_vars() 30575 1726867565.89832: done with get_vars() 30575 1726867565.89833: filtering new block on tags 30575 1726867565.89848: done filtering new block on tags 30575 1726867565.89851: in VariableManager get_vars() 30575 1726867565.89862: done with get_vars() 30575 1726867565.89863: filtering new block on tags 30575 1726867565.89876: done filtering new block on tags 30575 1726867565.89880: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 30575 1726867565.89886: extending task lists for all hosts with included blocks 30575 1726867565.89937: done extending task lists 30575 1726867565.89938: done processing included files 30575 1726867565.89939: results queue empty 30575 1726867565.89940: checking for any_errors_fatal 30575 1726867565.89941: done checking for any_errors_fatal 30575 
1726867565.89942: checking for max_fail_percentage 30575 1726867565.89943: done checking for max_fail_percentage 30575 1726867565.89944: checking to see if all hosts have failed and the running result is not ok 30575 1726867565.89944: done checking to see if all hosts have failed 30575 1726867565.89945: getting the remaining hosts for this loop 30575 1726867565.89946: done getting the remaining hosts for this loop 30575 1726867565.89949: getting the next task for host managed_node3 30575 1726867565.89954: done getting next task for host managed_node3 30575 1726867565.89956: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 30575 1726867565.89958: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867565.89960: getting variables 30575 1726867565.89961: in VariableManager get_vars() 30575 1726867565.89970: Calling all_inventory to load vars for managed_node3 30575 1726867565.89971: Calling groups_inventory to load vars for managed_node3 30575 1726867565.89974: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867565.89981: Calling all_plugins_play to load vars for managed_node3 30575 1726867565.89983: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867565.89986: Calling groups_plugins_play to load vars for managed_node3 30575 1726867565.90141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867565.90322: done with get_vars() 30575 1726867565.90333: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 17:26:05 -0400 (0:00:00.039) 0:00:01.281 ****** 30575 1726867565.90396: entering _queue_task() for managed_node3/setup 30575 1726867565.90652: worker is 1 (out of 1 available) 30575 1726867565.90664: exiting _queue_task() for managed_node3/setup 30575 1726867565.90681: done queuing things up, now waiting for results queue to drain 30575 1726867565.90683: waiting for pending results... 
30575 1726867565.90940: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 30575 1726867565.90990: in run() - task 0affcac9-a3a5-e081-a588-00000000002c 30575 1726867565.90999: variable 'ansible_search_path' from source: unknown 30575 1726867565.91002: variable 'ansible_search_path' from source: unknown 30575 1726867565.91032: calling self._execute() 30575 1726867565.91100: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867565.91107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867565.91114: variable 'omit' from source: magic vars 30575 1726867565.91647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867565.94465: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867565.94542: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867565.94569: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867565.94763: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867565.94784: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867565.94858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867565.94889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867565.94915: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867565.94983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867565.94987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867565.95120: variable 'ansible_facts' from source: unknown 30575 1726867565.95182: variable 'network_test_required_facts' from source: task vars 30575 1726867565.95283: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 30575 1726867565.95287: variable 'omit' from source: magic vars 30575 1726867565.95289: variable 'omit' from source: magic vars 30575 1726867565.95291: variable 'omit' from source: magic vars 30575 1726867565.95313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867565.95343: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867565.95358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867565.95375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867565.95387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867565.95416: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867565.95420: variable 'ansible_host' from source: host vars for 
'managed_node3' 30575 1726867565.95422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867565.95526: Set connection var ansible_pipelining to False 30575 1726867565.95529: Set connection var ansible_shell_type to sh 30575 1726867565.95532: Set connection var ansible_shell_executable to /bin/sh 30575 1726867565.95534: Set connection var ansible_timeout to 10 30575 1726867565.95536: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867565.95538: Set connection var ansible_connection to ssh 30575 1726867565.95636: variable 'ansible_shell_executable' from source: unknown 30575 1726867565.95639: variable 'ansible_connection' from source: unknown 30575 1726867565.95642: variable 'ansible_module_compression' from source: unknown 30575 1726867565.95644: variable 'ansible_shell_type' from source: unknown 30575 1726867565.95646: variable 'ansible_shell_executable' from source: unknown 30575 1726867565.95648: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867565.95650: variable 'ansible_pipelining' from source: unknown 30575 1726867565.95652: variable 'ansible_timeout' from source: unknown 30575 1726867565.95654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867565.95709: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867565.95717: variable 'omit' from source: magic vars 30575 1726867565.95720: starting attempt loop 30575 1726867565.95725: running the handler 30575 1726867565.95744: _low_level_execute_command(): starting 30575 1726867565.95746: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867565.96490: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 
1726867565.96499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867565.96635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867565.96649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867565.96663: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867565.96670: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867565.96682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867565.96726: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867565.96730: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867565.96733: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867565.96735: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867565.96738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867565.96740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867565.96742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867565.96836: stderr chunk (state=3): >>>debug2: match found <<< 30575 1726867565.96840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867565.96890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867565.96909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867565.97112: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 30575 1726867565.97160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867565.98854: stdout chunk (state=3): >>>/root <<< 30575 1726867565.98951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867565.98980: stderr chunk (state=3): >>><<< 30575 1726867565.98986: stdout chunk (state=3): >>><<< 30575 1726867565.99008: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867565.99019: _low_level_execute_command(): starting 30575 1726867565.99027: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867565.9900854-30638-37395391382857 `" && echo ansible-tmp-1726867565.9900854-30638-37395391382857="` echo 
/root/.ansible/tmp/ansible-tmp-1726867565.9900854-30638-37395391382857 `" ) && sleep 0' 30575 1726867565.99542: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867565.99551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867565.99561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867565.99575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867565.99589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867565.99603: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867565.99606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867565.99619: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867565.99628: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867565.99683: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867565.99686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867565.99688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867565.99691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867565.99693: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867565.99695: stderr chunk (state=3): >>>debug2: match found <<< 30575 1726867565.99697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867565.99787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' <<< 30575 1726867565.99829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867565.99836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867566.00053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867566.01800: stdout chunk (state=3): >>>ansible-tmp-1726867565.9900854-30638-37395391382857=/root/.ansible/tmp/ansible-tmp-1726867565.9900854-30638-37395391382857 <<< 30575 1726867566.02026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867566.02030: stderr chunk (state=3): >>><<< 30575 1726867566.02032: stdout chunk (state=3): >>><<< 30575 1726867566.02043: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867565.9900854-30638-37395391382857=/root/.ansible/tmp/ansible-tmp-1726867565.9900854-30638-37395391382857 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867566.02093: variable 'ansible_module_compression' from source: unknown 30575 1726867566.02138: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 30575 1726867566.02300: variable 'ansible_facts' from source: unknown 30575 1726867566.02684: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867565.9900854-30638-37395391382857/AnsiballZ_setup.py 30575 1726867566.02898: Sending initial data 30575 1726867566.02909: Sent initial data (153 bytes) 30575 1726867566.03397: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867566.03412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867566.03426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867566.03445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867566.03464: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867566.03479: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867566.03495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867566.03514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867566.03527: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867566.03539: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867566.03597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867566.03642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867566.03665: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867566.03685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867566.03754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867566.05324: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867566.05359: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867566.05415: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp144bl3ir /root/.ansible/tmp/ansible-tmp-1726867565.9900854-30638-37395391382857/AnsiballZ_setup.py <<< 30575 1726867566.05437: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867565.9900854-30638-37395391382857/AnsiballZ_setup.py" <<< 30575 1726867566.05671: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp144bl3ir" to remote "/root/.ansible/tmp/ansible-tmp-1726867565.9900854-30638-37395391382857/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867565.9900854-30638-37395391382857/AnsiballZ_setup.py" <<< 30575 1726867566.06956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867566.07010: stderr chunk (state=3): >>><<< 30575 1726867566.07013: stdout chunk (state=3): >>><<< 30575 1726867566.07048: done transferring module to remote 30575 1726867566.07051: _low_level_execute_command(): starting 30575 1726867566.07053: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867565.9900854-30638-37395391382857/ /root/.ansible/tmp/ansible-tmp-1726867565.9900854-30638-37395391382857/AnsiballZ_setup.py && sleep 0' 30575 1726867566.07462: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867566.07466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867566.07468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass <<< 30575 1726867566.07470: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867566.07472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867566.07521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867566.07529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867566.07571: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867566.09365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867566.09368: stdout chunk (state=3): >>><<< 30575 1726867566.09371: stderr chunk (state=3): >>><<< 30575 1726867566.09484: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867566.09488: _low_level_execute_command(): starting 30575 1726867566.09492: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867565.9900854-30638-37395391382857/AnsiballZ_setup.py && sleep 0' 30575 1726867566.09990: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867566.10011: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867566.10057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867566.10060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867566.10062: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867566.10064: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867566.10066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 30575 1726867566.10117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867566.10151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867566.10190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867566.12303: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 30575 1726867566.12373: stdout chunk (state=3): >>>import _imp # builtin <<< 30575 1726867566.12415: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 30575 1726867566.12451: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 30575 1726867566.12489: stdout chunk (state=3): >>>import 'posix' # <<< 30575 1726867566.12537: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 30575 1726867566.12555: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 30575 1726867566.12597: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.12616: stdout chunk (state=3): >>>import '_codecs' # <<< 30575 1726867566.12642: stdout chunk (state=3): >>>import 'codecs' # <<< 30575 1726867566.12673: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 30575 1726867566.12705: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 30575 1726867566.12713: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f92184d0> <<< 30575 1726867566.12716: stdout chunk 
(state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f91e7b30> <<< 30575 1726867566.12749: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 30575 1726867566.12771: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f921aa50> <<< 30575 1726867566.12778: stdout chunk (state=3): >>>import '_signal' # <<< 30575 1726867566.12801: stdout chunk (state=3): >>>import '_abc' # <<< 30575 1726867566.12807: stdout chunk (state=3): >>>import 'abc' # <<< 30575 1726867566.12825: stdout chunk (state=3): >>>import 'io' # <<< 30575 1726867566.12859: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 30575 1726867566.12946: stdout chunk (state=3): >>>import '_collections_abc' # <<< 30575 1726867566.12975: stdout chunk (state=3): >>>import 'genericpath' # <<< 30575 1726867566.12979: stdout chunk (state=3): >>>import 'posixpath' # <<< 30575 1726867566.12998: stdout chunk (state=3): >>>import 'os' # <<< 30575 1726867566.13029: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 30575 1726867566.13032: stdout chunk (state=3): >>>Processing user site-packages <<< 30575 1726867566.13053: stdout chunk (state=3): >>>Processing global site-packages <<< 30575 1726867566.13056: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 30575 1726867566.13059: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 30575 1726867566.13093: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 30575 1726867566.13117: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f902d130> <<< 30575 1726867566.13174: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 30575 1726867566.13185: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.13190: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f902dfa0> <<< 30575 1726867566.13216: stdout chunk (state=3): >>>import 'site' # <<< 30575 1726867566.13242: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 30575 1726867566.13627: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 30575 1726867566.13635: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 30575 1726867566.13660: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 30575 1726867566.13663: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.13692: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 30575 1726867566.13725: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 30575 1726867566.13747: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 30575 1726867566.13765: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 30575 1726867566.13788: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f906be90> <<< 30575 1726867566.13799: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 30575 1726867566.13820: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 30575 1726867566.13845: stdout chunk (state=3): >>>import '_operator' # <<< 30575 1726867566.13852: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f906bf50> <<< 30575 1726867566.13864: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 30575 1726867566.13892: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 30575 1726867566.13916: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 30575 1726867566.13963: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.13980: stdout chunk (state=3): >>>import 'itertools' # <<< 30575 1726867566.14006: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 30575 1726867566.14015: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90a3830> <<< 30575 1726867566.14033: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 30575 1726867566.14050: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90a3ec0> <<< 30575 1726867566.14065: stdout chunk (state=3): >>>import '_collections' # <<< 30575 1726867566.14106: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f9083b60> <<< 30575 1726867566.14124: stdout chunk (state=3): >>>import '_functools' # <<< 30575 1726867566.14149: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f9081280> <<< 30575 1726867566.14239: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f9069040> <<< 30575 1726867566.14260: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_compiler.py <<< 30575 1726867566.14287: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 30575 1726867566.14290: stdout chunk (state=3): >>>import '_sre' # <<< 30575 1726867566.14316: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 30575 1726867566.14338: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 30575 1726867566.14363: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 30575 1726867566.14403: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90c37d0> <<< 30575 1726867566.14407: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90c23f0> <<< 30575 1726867566.14444: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 30575 1726867566.14449: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f9082150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90c0c20> <<< 30575 1726867566.14513: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 30575 1726867566.14517: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90f8860> <<< 30575 1726867566.14519: 
stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90682c0> <<< 30575 1726867566.14541: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 30575 1726867566.14546: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 30575 1726867566.14575: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.14583: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f90f8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90f8bc0> <<< 30575 1726867566.14623: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f90f8f80> <<< 30575 1726867566.14631: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f9066de0> <<< 30575 1726867566.14661: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 30575 1726867566.14666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.14684: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 30575 1726867566.14725: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 30575 1726867566.14735: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90f9610> <<< 30575 1726867566.14743: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90f92e0> import 'importlib.machinery' # <<< 30575 1726867566.14768: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 30575 1726867566.14780: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 30575 1726867566.14805: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90fa510> <<< 30575 1726867566.14808: stdout chunk (state=3): >>>import 'importlib.util' # <<< 30575 1726867566.14823: stdout chunk (state=3): >>>import 'runpy' # <<< 30575 1726867566.14840: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 30575 1726867566.14869: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 30575 1726867566.14994: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f9110710> <<< 30575 1726867566.15098: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f19f9111df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 30575 1726867566.15110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f9112c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f91132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f91121e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 30575 1726867566.15132: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.15150: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f9113d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f91134a0> <<< 30575 1726867566.15219: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90fa540> <<< 30575 1726867566.15222: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 30575 1726867566.15237: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 30575 1726867566.15253: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 30575 1726867566.15330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 30575 1726867566.15333: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.15389: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8e1bbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 30575 1726867566.15412: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8e446b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e44410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8e446e0> <<< 30575 1726867566.15438: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 30575 
1726867566.15496: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.15623: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8e45010> <<< 30575 1726867566.15780: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.15882: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8e45a00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e448c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e19d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 30575 1726867566.15887: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 30575 1726867566.15889: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e46e10> <<< 30575 1726867566.15892: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e45b50> <<< 30575 1726867566.15916: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90fac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/zipfile/__init__.py <<< 30575 1726867566.15982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.16000: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 30575 1726867566.16038: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 30575 1726867566.16063: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e6f1a0> <<< 30575 1726867566.16116: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 30575 1726867566.16136: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.16174: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 30575 1726867566.16225: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e93530> <<< 30575 1726867566.16229: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 30575 1726867566.16279: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 30575 1726867566.16321: stdout chunk (state=3): >>>import 'ntpath' # <<< 30575 1726867566.16488: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f19f8ef4290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 30575 1726867566.16541: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8ef69f0> <<< 30575 1726867566.16617: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8ef43b0> <<< 30575 1726867566.16649: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8eb92e0> <<< 30575 1726867566.16709: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8cfd3a0> <<< 30575 1726867566.16716: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e92360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e47d70> <<< 30575 1726867566.16883: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 30575 1726867566.16897: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f19f8cfd640> <<< 30575 1726867566.17145: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_v9_f8m04/ansible_setup_payload.zip' <<< 30575 1726867566.17166: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 30575 1726867566.17281: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.17301: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 30575 1726867566.17495: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8d67080> import '_typing' # <<< 30575 1726867566.17637: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8d45f70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8d45100> # zipimport: zlib available <<< 30575 1726867566.17674: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 30575 1726867566.17702: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30575 1726867566.17728: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 30575 1726867566.17746: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.19126: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.20244: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 30575 1726867566.20249: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f19f8d64f50> <<< 30575 1726867566.20274: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.20302: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 30575 1726867566.20334: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 30575 1726867566.20364: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8d96ab0> <<< 30575 1726867566.20405: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8d96840> <<< 30575 1726867566.20432: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8d96150> <<< 30575 1726867566.20455: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 30575 1726867566.20460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 30575 1726867566.20508: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8d965a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8d67d10> <<< 30575 
1726867566.20511: stdout chunk (state=3): >>>import 'atexit' # <<< 30575 1726867566.20541: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.20546: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8d97830> <<< 30575 1726867566.20572: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.20575: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8d97a70> <<< 30575 1726867566.20597: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 30575 1726867566.20641: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 30575 1726867566.20655: stdout chunk (state=3): >>>import '_locale' # <<< 30575 1726867566.20700: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8d97fb0> <<< 30575 1726867566.20714: stdout chunk (state=3): >>>import 'pwd' # <<< 30575 1726867566.20728: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 30575 1726867566.20754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 30575 1726867566.20786: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f872dd60> <<< 30575 1726867566.20819: stdout chunk (state=3): >>># extension module 
'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f872f4a0> <<< 30575 1726867566.20848: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 30575 1726867566.20853: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 30575 1726867566.20896: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f87302f0> <<< 30575 1726867566.20908: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 30575 1726867566.20946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 30575 1726867566.20956: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8731490> <<< 30575 1726867566.20979: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 30575 1726867566.21011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 30575 1726867566.21035: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 30575 1726867566.21039: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 30575 1726867566.21088: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8733f50> <<< 30575 1726867566.21127: stdout chunk (state=3): >>># extension module '_posixsubprocess' 
loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.21134: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8738290> <<< 30575 1726867566.21144: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8732240> <<< 30575 1726867566.21170: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 30575 1726867566.21190: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 30575 1726867566.21216: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 30575 1726867566.21220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 30575 1726867566.21243: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 30575 1726867566.21341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 30575 1726867566.21373: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 30575 1726867566.21388: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f873bef0> <<< 30575 1726867566.21393: stdout chunk (state=3): >>>import '_tokenize' # <<< 30575 1726867566.21457: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f19f873a9c0> <<< 30575 1726867566.21463: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f873a750> <<< 30575 1726867566.21487: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 30575 1726867566.21494: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 30575 1726867566.21564: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f873ac90> <<< 30575 1726867566.21590: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8732720> <<< 30575 1726867566.21619: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f877ff80> <<< 30575 1726867566.21648: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f87802f0> <<< 30575 1726867566.21680: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 30575 1726867566.21691: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 30575 1726867566.21717: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc 
matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 30575 1726867566.21755: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8781d00> <<< 30575 1726867566.21760: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8781ac0> <<< 30575 1726867566.21776: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 30575 1726867566.21810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 30575 1726867566.21863: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.21865: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8784260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f87823c0> <<< 30575 1726867566.21892: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 30575 1726867566.21928: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.21950: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 30575 1726867566.21964: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 30575 1726867566.21967: stdout chunk (state=3): >>>import '_string' # <<< 30575 1726867566.22010: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f87879e0> <<< 30575 1726867566.22132: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f87843b0> <<< 30575 1726867566.22193: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.22199: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8788830> <<< 30575 1726867566.22227: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.22233: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8788bf0> <<< 30575 1726867566.22272: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.22275: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8788cb0> <<< 30575 1726867566.22286: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f19f8780380> <<< 30575 1726867566.22307: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 30575 1726867566.22336: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 30575 1726867566.22349: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 30575 1726867566.22383: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.22402: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.22408: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8610320> <<< 30575 1726867566.22547: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.22560: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f86112e0> <<< 30575 1726867566.22572: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f878aab0> <<< 30575 1726867566.22603: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.22615: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f878be60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f878a6f0> <<< 30575 1726867566.22629: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.22646: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 30575 1726867566.22659: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.22746: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.22840: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.22843: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 30575 1726867566.22883: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.22887: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.22894: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 30575 1726867566.22900: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.23020: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.23135: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.23663: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.24201: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 30575 1726867566.24205: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 30575 1726867566.24217: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 30575 1726867566.24230: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 30575 
1726867566.24251: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.24299: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.24305: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8619520> <<< 30575 1726867566.24384: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 30575 1726867566.24388: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 30575 1726867566.24400: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f861a300> <<< 30575 1726867566.24416: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8611130> <<< 30575 1726867566.24456: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 30575 1726867566.24471: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.24488: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.24511: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 30575 1726867566.24516: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.24664: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.24815: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 30575 1726867566.24823: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 30575 
1726867566.24838: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f861a330> <<< 30575 1726867566.24847: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.25291: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.25730: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.25797: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.25874: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 30575 1726867566.25880: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.25932: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.25960: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 30575 1726867566.25968: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.26039: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.26122: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 30575 1726867566.26129: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.26158: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 30575 1726867566.26164: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.26209: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.26242: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 30575 1726867566.26259: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.26481: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.26711: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 30575 1726867566.26770: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 30575 1726867566.26786: stdout chunk (state=3): >>>import '_ast' # <<< 30575 1726867566.26846: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f861b560> <<< 30575 1726867566.26864: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.26934: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.27014: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 30575 1726867566.27025: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 30575 1726867566.27031: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 30575 1726867566.27052: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.27091: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.27136: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 30575 1726867566.27141: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.27188: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.27233: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.27290: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.27357: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 30575 1726867566.27398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.27479: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.27485: stdout chunk 
(state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8626090> <<< 30575 1726867566.27517: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f86217c0> <<< 30575 1726867566.27547: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 30575 1726867566.27556: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 30575 1726867566.27570: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.27626: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.27690: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.27720: stdout chunk (state=3): >>># zipimport: zlib available<<< 30575 1726867566.27725: stdout chunk (state=3): >>> <<< 30575 1726867566.27767: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 30575 1726867566.27771: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.27788: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 30575 1726867566.27813: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 30575 1726867566.27832: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 30575 1726867566.27892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 30575 
1726867566.27908: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 30575 1726867566.27930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 30575 1726867566.27981: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f870ea50> <<< 30575 1726867566.28025: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8dc2720> <<< 30575 1726867566.28107: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8626270> <<< 30575 1726867566.28118: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8625e20> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 30575 1726867566.28127: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.28155: stdout chunk (state=3): >>># zipimport: zlib available<<< 30575 1726867566.28160: stdout chunk (state=3): >>> <<< 30575 1726867566.28197: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 30575 1726867566.28200: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 30575 1726867566.28241: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 30575 1726867566.28267: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.28270: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 30575 1726867566.28291: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.28349: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.28408: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.28435: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 
1726867566.28451: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.28496: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.28539: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.28575: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.28608: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 30575 1726867566.28626: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.28695: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.28767: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.28790: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.28829: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 30575 1726867566.28834: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.29012: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.29181: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.29227: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.29282: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 30575 1726867566.29285: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.29303: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 30575 1726867566.29324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 30575 1726867566.29337: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc 
matches /usr/lib64/python3.12/multiprocessing/process.py <<< 30575 1726867566.29363: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 30575 1726867566.29381: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f86b6180> <<< 30575 1726867566.29409: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 30575 1726867566.29418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 30575 1726867566.29442: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 30575 1726867566.29481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 30575 1726867566.29508: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 30575 1726867566.29521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 30575 1726867566.29545: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f825c140> <<< 30575 1726867566.29585: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.29588: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f825c4a0> <<< 30575 1726867566.29645: stdout chunk (state=3): >>>import 'pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f19f869c770> <<< 30575 1726867566.29654: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f86b6cc0> <<< 30575 1726867566.29706: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f86b48f0> <<< 30575 1726867566.29711: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f86b44a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 30575 1726867566.29774: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 30575 1726867566.29816: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 30575 1726867566.29820: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 30575 1726867566.29883: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 30575 1726867566.29887: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f825f3e0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f825ec90> <<< 30575 1726867566.29919: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' 
executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f825ee70> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f825e0c0> <<< 30575 1726867566.29938: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 30575 1726867566.30048: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 30575 1726867566.30075: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f825f590> <<< 30575 1726867566.30106: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 30575 1726867566.30138: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f82a2090> <<< 30575 1726867566.30167: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f825fce0> <<< 30575 1726867566.30205: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f86b44d0> import 'ansible.module_utils.facts.timeout' # <<< 30575 1726867566.30260: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib 
available <<< 30575 1726867566.30264: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 30575 1726867566.30400: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.30422: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 30575 1726867566.30451: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.30501: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 30575 1726867566.30525: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.30530: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 30575 1726867566.30547: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.30576: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.30608: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 30575 1726867566.30614: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.30666: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.30719: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 30575 1726867566.30728: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.30766: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.30813: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 30575 1726867566.30819: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.30881: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.30938: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.30999: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.31054: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' 
# <<< 30575 1726867566.31063: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 30575 1726867566.31069: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.31547: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.31972: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 30575 1726867566.31987: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.32038: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.32095: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.32127: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.32163: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 30575 1726867566.32171: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 30575 1726867566.32182: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.32216: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.32248: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 30575 1726867566.32257: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.32316: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.32382: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 30575 1726867566.32400: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.32419: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.32482: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 30575 1726867566.32527: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.32531: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 
30575 1726867566.32628: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.32896: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f82a3d10> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f82a2e40> import 'ansible.module_utils.facts.system.local' # <<< 30575 1726867566.32902: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.32970: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.33036: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 30575 1726867566.33050: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.33142: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.33235: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 30575 1726867566.33304: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.33392: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 30575 1726867566.33395: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.33436: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.33480: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 30575 1726867566.33523: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 30575 1726867566.33594: stdout chunk (state=3): >>># 
extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.33659: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f82ea450> <<< 30575 1726867566.33855: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f82da210> import 'ansible.module_utils.facts.system.python' # <<< 30575 1726867566.33858: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.33915: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.33982: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 30575 1726867566.33985: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.34063: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.34148: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.34260: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.34405: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 30575 1726867566.34429: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.34457: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.34508: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 30575 1726867566.34511: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.34553: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.34603: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 30575 1726867566.34656: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.34713: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f82fe030> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f82fdd30> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 30575 1726867566.34716: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.34728: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 30575 1726867566.34764: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.34801: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 30575 1726867566.34814: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.34968: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.35120: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 30575 1726867566.35225: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.35324: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.35357: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.35404: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 30575 1726867566.35453: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.35457: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.35468: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 30575 1726867566.35598: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.35752: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 30575 1726867566.35766: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.35873: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.36000: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 30575 1726867566.36026: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.36038: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.36070: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.36606: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.37114: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 30575 1726867566.37124: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.37225: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.37332: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 30575 1726867566.37349: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.37431: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.37533: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 30575 1726867566.37690: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.37858: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 30575 1726867566.37861: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.37889: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 30575 1726867566.37925: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.37974: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 30575 1726867566.37979: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.38073: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.38168: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.38367: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.38572: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 30575 1726867566.38592: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.38620: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.38666: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 30575 1726867566.38696: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.38719: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 30575 1726867566.38798: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.38888: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 30575 1726867566.38891: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.38928: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # <<< 30575 1726867566.38931: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.38991: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.39058: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 30575 1726867566.39061: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 30575 1726867566.39112: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.39174: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 30575 1726867566.39178: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.39438: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.39700: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 30575 1726867566.39703: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.39754: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.39822: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 30575 1726867566.39825: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.39858: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.39904: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 30575 1726867566.39935: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.39979: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 30575 1726867566.39982: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.40016: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.40054: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 30575 1726867566.40057: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.40134: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.40229: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 30575 1726867566.40232: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.40257: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 30575 1726867566.40298: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.40350: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 30575 1726867566.40394: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.40398: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.40445: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.40493: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.40560: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.40643: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 30575 1726867566.40663: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.40704: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.40757: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 30575 1726867566.40952: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.41152: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 30575 1726867566.41163: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.41196: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.41251: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 30575 1726867566.41261: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.41298: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.41356: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 30575 
1726867566.41438: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.41526: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 30575 1726867566.41540: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.41617: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.41710: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 30575 1726867566.41792: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.42724: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 30575 1726867566.42728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 30575 1726867566.42762: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 30575 1726867566.42765: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 30575 1726867566.42805: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f80ffe60> <<< 30575 1726867566.42816: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f80fee10> <<< 30575 1726867566.42861: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f80fc710> <<< 
30575 1726867566.43242: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7uUwLUrAgQyz7a8YAgUBvVYqUHIXrD9OD4IdRIvS0wM5DjkrqZknN+lxTMZpMWg/jlXFJVvFXYt0TDIxUv3VMQ7CG9CyYmBbeuJEWvoYwb8DuKGoQjvaw9hPa0a/tdKQRUk5Ee48tJDb1/f7b8HC6W47zMa508De0NKmJpkUCxPSiwETfkKtSFi1NU3yedKOlKSYO4jtNZMDSixlZgDT5la3jcB1k7FimMu61ZL4YdRdqowrsERzsKoyoubw2+euaXWxsKU9sxogT2uxy65PoA58KxP/BEqzQxzR9t9sEvGNVBRBcuBPyFKAEMwdm8wwEuHftGIX6HVD1ZyJ1kV94Sw1QBrBKVYLOc0F2Vfxah2KpheJtfxHN+3Y3VDCJCkheMOUfJL9Uq80f2+8xs3fb05mdaTabyPG6tsrK36M4NCUEwR/rlJ3z1xlUO5AQ7JnNr6OrRQTCXiZXYW8yubiTXlPYBD02/Zw1skEHGR9bVLEEd//GNW0z8DiLO9vRib8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFKa0veb+8P6VFgxqYEtIVaL2y6+Ja4kI5pG6tlGueD6mqrC1AYcokgYEcDSMDOhGEqO5Njf6G9zjcAWiPgmZds=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIE2riHWdppRksv40oyHGkAt2lseuRiuwNlSobn5rl+/f", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 49840 10.31.15.68 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 49840 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", 
"minute": "26", "second": "06", "epoch": "1726867566", "epoch_int": "1726867566", "date": "2024-09-20", "time": "17:26:06", "iso8601_micro": "2024-09-20T21:26:06.424999Z", "iso8601": "2024-09-20T21:26:06Z", "iso8601_basic": "20240920T172606424999", "iso8601_basic_short": "20240920T172606", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-68", "ansible_nodename": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24e9df8b51e91cc3587e46253f155b", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 30575 1726867566.43783: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 30575 1726867566.43882: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc <<< 30575 1726867566.43886: stdout chunk (state=3): >>># clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal <<< 30575 1726867566.43889: 
stdout chunk (state=3): >>># cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins <<< 30575 1726867566.43891: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler <<< 30575 1726867566.43894: stdout chunk (state=3): >>># cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing 
runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random <<< 30575 1726867566.43946: stdout chunk (state=3): >>># cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] 
removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 <<< 30575 1726867566.43950: stdout chunk (state=3): >>># cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes <<< 30575 1726867566.43995: stdout chunk (state=3): >>># destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing 
ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process <<< 30575 1726867566.44001: stdout chunk (state=3): >>># destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing 
ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context <<< 30575 1726867566.44034: stdout chunk (state=3): >>># cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] 
removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl <<< 30575 1726867566.44062: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] 
removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd <<< 30575 1726867566.44093: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy 
ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd <<< 30575 1726867566.44107: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # 
destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 30575 1726867566.44414: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 30575 1726867566.44441: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 30575 1726867566.44482: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 30575 1726867566.44495: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 30575 1726867566.44526: stdout chunk (state=3): >>># destroy ntpath <<< 30575 1726867566.44555: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy 
json.scanner # destroy _json # destroy grp # destroy encodings <<< 30575 1726867566.44602: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 30575 1726867566.44613: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 30575 1726867566.44649: stdout chunk (state=3): >>># destroy selinux <<< 30575 1726867566.44660: stdout chunk (state=3): >>># destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 30575 1726867566.44715: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal <<< 30575 1726867566.44743: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle <<< 30575 1726867566.44760: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 30575 1726867566.44795: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 30575 1726867566.44809: stdout chunk (state=3): >>># destroy _ssl <<< 30575 1726867566.44849: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json <<< 30575 1726867566.44873: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 30575 1726867566.44898: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 30575 1726867566.44917: 
stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 30575 1726867566.44975: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap <<< 30575 1726867566.44992: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect <<< 30575 1726867566.45032: stdout chunk (state=3): >>># cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 30575 1726867566.45081: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # 
cleanup[3] wiping os <<< 30575 1726867566.45097: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 30575 1726867566.45109: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 30575 1726867566.45244: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 30575 1726867566.45374: stdout chunk (state=3): >>># destroy _collections <<< 30575 1726867566.45408: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 30575 1726867566.45411: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 30575 1726867566.45418: stdout chunk (state=3): >>># destroy _typing <<< 30575 1726867566.45422: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy 
_io # destroy marshal <<< 30575 1726867566.45428: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 30575 1726867566.45506: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 30575 1726867566.45552: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib <<< 30575 1726867566.45615: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 30575 1726867566.45618: stdout chunk (state=3): >>># clear sys.audit hooks <<< 30575 1726867566.45992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867566.45995: stdout chunk (state=3): >>><<< 30575 1726867566.45997: stderr chunk (state=3): >>><<< 30575 1726867566.46194: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f92184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f91e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f921aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f902d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f902dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f906be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f906bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90a3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f9083b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f9081280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f9069040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90c23f0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f9082150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90c0c20> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90f8860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f90f8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90f8bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f90f8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f9066de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90f9610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90f92e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90fa510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f9110710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f9111df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f9112c90> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f91132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f91121e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f9113d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f91134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90fa540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8e1bbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8e446b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e44410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8e446e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8e45010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8e45a00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e448c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e19d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e46e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e45b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f90fac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e6f1a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e93530> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8ef4290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8ef69f0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8ef43b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8eb92e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8cfd3a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e92360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8e47d70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f19f8cfd640> # zipimport: found 103 names in '/tmp/ansible_setup_payload_v9_f8m04/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f19f8d67080> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8d45f70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8d45100> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8d64f50> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8d96ab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8d96840> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8d96150> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8d965a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8d67d10> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8d97830> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8d97a70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8d97fb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f872dd60> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f872f4a0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f19f87302f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8731490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8733f50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8738290> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8732240> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f19f873bef0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f873a9c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f873a750> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f873ac90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8732720> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f877ff80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f87802f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8781d00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8781ac0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8784260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f87823c0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f87879e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f87843b0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8788830> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8788bf0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8788cb0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8780380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8610320> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f86112e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f878aab0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f878be60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f878a6f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8619520> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f861a300> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8611130> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f861a330> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f861b560> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f8626090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f86217c0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f870ea50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8dc2720> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8626270> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f8625e20> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f86b6180> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f825c140> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f825c4a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f869c770> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f86b6cc0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f86b48f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f86b44a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f825f3e0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f825ec90> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f825ee70> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f825e0c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f825f590> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f82a2090> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f825fce0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f86b44d0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f82a3d10> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f82a2e40> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f82ea450> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f82da210> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f82fe030> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f82fdd30> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19f80ffe60> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f80fee10> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19f80fc710> {"ansible_facts": {"ansible_lsb": {}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC7uUwLUrAgQyz7a8YAgUBvVYqUHIXrD9OD4IdRIvS0wM5DjkrqZknN+lxTMZpMWg/jlXFJVvFXYt0TDIxUv3VMQ7CG9CyYmBbeuJEWvoYwb8DuKGoQjvaw9hPa0a/tdKQRUk5Ee48tJDb1/f7b8HC6W47zMa508De0NKmJpkUCxPSiwETfkKtSFi1NU3yedKOlKSYO4jtNZMDSixlZgDT5la3jcB1k7FimMu61ZL4YdRdqowrsERzsKoyoubw2+euaXWxsKU9sxogT2uxy65PoA58KxP/BEqzQxzR9t9sEvGNVBRBcuBPyFKAEMwdm8wwEuHftGIX6HVD1ZyJ1kV94Sw1QBrBKVYLOc0F2Vfxah2KpheJtfxHN+3Y3VDCJCkheMOUfJL9Uq80f2+8xs3fb05mdaTabyPG6tsrK36M4NCUEwR/rlJ3z1xlUO5AQ7JnNr6OrRQTCXiZXYW8yubiTXlPYBD02/Zw1skEHGR9bVLEEd//GNW0z8DiLO9vRib8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFKa0veb+8P6VFgxqYEtIVaL2y6+Ja4kI5pG6tlGueD6mqrC1AYcokgYEcDSMDOhGEqO5Njf6G9zjcAWiPgmZds=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIE2riHWdppRksv40oyHGkAt2lseuRiuwNlSobn5rl+/f", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 49840 10.31.15.68 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 49840 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", 
"ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "26", "second": "06", "epoch": "1726867566", "epoch_int": "1726867566", "date": "2024-09-20", "time": "17:26:06", "iso8601_micro": "2024-09-20T21:26:06.424999Z", "iso8601": "2024-09-20T21:26:06Z", "iso8601_basic": "20240920T172606424999", "iso8601_basic_short": "20240920T172606", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", 
"ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-68", "ansible_nodename": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24e9df8b51e91cc3587e46253f155b", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing 
encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] 
removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy 
ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing 
ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing 
ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing 
ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy 
ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy 
_bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # 
cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
[WARNING]: Module invocation had junk after the JSON data
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # 
destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] 
wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 30575 1726867566.47137: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867565.9900854-30638-37395391382857/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867566.47140: _low_level_execute_command(): starting 30575 1726867566.47143: _low_level_execute_command(): 
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867565.9900854-30638-37395391382857/ > /dev/null 2>&1 && sleep 0' 30575 1726867566.47145: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867566.47148: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867566.47150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867566.47152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867566.47154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867566.47156: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867566.47159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867566.47161: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867566.47163: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867566.47165: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867566.47184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867566.47193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867566.47241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867566.47245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867566.47254: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30575 1726867566.47295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867566.49127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867566.49149: stderr chunk (state=3): >>><<< 30575 1726867566.49152: stdout chunk (state=3): >>><<< 30575 1726867566.49165: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867566.49171: handler run complete 30575 1726867566.49200: variable 'ansible_facts' from source: unknown 30575 1726867566.49239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867566.49309: variable 'ansible_facts' from source: unknown 30575 1726867566.49342: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867566.49376: attempt loop complete, returning result 30575 1726867566.49382: _execute() done 30575 1726867566.49384: dumping result to json 30575 1726867566.49392: done dumping result, returning 30575 1726867566.49400: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcac9-a3a5-e081-a588-00000000002c] 30575 1726867566.49404: sending task result for task 0affcac9-a3a5-e081-a588-00000000002c 30575 1726867566.49527: done sending task result for task 0affcac9-a3a5-e081-a588-00000000002c 30575 1726867566.49529: WORKER PROCESS EXITING ok: [managed_node3] 30575 1726867566.49622: no more pending results, returning what we have 30575 1726867566.49625: results queue empty 30575 1726867566.49626: checking for any_errors_fatal 30575 1726867566.49627: done checking for any_errors_fatal 30575 1726867566.49628: checking for max_fail_percentage 30575 1726867566.49629: done checking for max_fail_percentage 30575 1726867566.49630: checking to see if all hosts have failed and the running result is not ok 30575 1726867566.49631: done checking to see if all hosts have failed 30575 1726867566.49632: getting the remaining hosts for this loop 30575 1726867566.49633: done getting the remaining hosts for this loop 30575 1726867566.49637: getting the next task for host managed_node3 30575 1726867566.49647: done getting next task for host managed_node3 30575 1726867566.49649: ^ task is: TASK: Check if system is ostree 30575 1726867566.49651: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867566.49654: getting variables 30575 1726867566.49655: in VariableManager get_vars() 30575 1726867566.49684: Calling all_inventory to load vars for managed_node3 30575 1726867566.49686: Calling groups_inventory to load vars for managed_node3 30575 1726867566.49689: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867566.49698: Calling all_plugins_play to load vars for managed_node3 30575 1726867566.49700: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867566.49703: Calling groups_plugins_play to load vars for managed_node3 30575 1726867566.49843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867566.49954: done with get_vars() 30575 1726867566.49961: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 17:26:06 -0400 (0:00:00.596) 0:00:01.877 ****** 30575 1726867566.50033: entering _queue_task() for managed_node3/stat 30575 1726867566.50330: worker is 1 (out of 1 available) 30575 1726867566.50342: exiting _queue_task() for managed_node3/stat 30575 1726867566.50359: done queuing things up, now waiting for results queue to drain 30575 1726867566.50361: waiting for pending results... 
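[Editor's note] The temporary-directory bookkeeping in the entries above — directories such as /root/.ansible/tmp/ansible-tmp-1726867565.9900854-30638-37395391382857/ are created per task, used to stage the module payload, and then removed with rm -f -r once the task finishes — follows a recognisable three-field naming scheme. A minimal Python sketch of one plausible way such names are derived (an assumption based on the shape of the names in this log, not code copied from ansible-core: wall-clock time, worker PID, and a random suffix):

```python
# Sketch (assumption, not ansible-core source): remote tmp dir names like
# ansible-tmp-1726867566.5533974-30667-67107394169630 appear to combine
# wall-clock time, the worker PID, and a random trailing integer.
import os
import random
import time

def make_tmp_dir_name() -> str:
    # time.time() supplies the leading float, os.getpid() the middle
    # field, and a random integer the trailing field.
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))

print(make_tmp_dir_name())
```

The mkdir command visible later in the log then creates that directory under umask 77, so the staged AnsiballZ_stat.py payload is readable only by the connecting user.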
30575 1726867566.50574: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 30575 1726867566.50636: in run() - task 0affcac9-a3a5-e081-a588-00000000002e 30575 1726867566.50646: variable 'ansible_search_path' from source: unknown 30575 1726867566.50649: variable 'ansible_search_path' from source: unknown 30575 1726867566.50676: calling self._execute() 30575 1726867566.50731: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867566.50734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867566.50743: variable 'omit' from source: magic vars 30575 1726867566.51143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867566.51402: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867566.51405: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867566.51407: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867566.51444: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867566.51520: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867566.51544: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867566.51572: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867566.51600: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867566.51719: Evaluated conditional (not __network_is_ostree is defined): True 30575 1726867566.51725: variable 'omit' from source: magic vars 30575 1726867566.51760: variable 'omit' from source: magic vars 30575 1726867566.51845: variable 'omit' from source: magic vars 30575 1726867566.51848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867566.51873: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867566.51899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867566.51921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867566.51940: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867566.51986: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867566.51996: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867566.52064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867566.52124: Set connection var ansible_pipelining to False 30575 1726867566.52135: Set connection var ansible_shell_type to sh 30575 1726867566.52150: Set connection var ansible_shell_executable to /bin/sh 30575 1726867566.52169: Set connection var ansible_timeout to 10 30575 1726867566.52185: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867566.52196: Set connection var ansible_connection to ssh 30575 1726867566.52236: variable 'ansible_shell_executable' from source: unknown 30575 1726867566.52284: variable 'ansible_connection' from 
source: unknown 30575 1726867566.52287: variable 'ansible_module_compression' from source: unknown 30575 1726867566.52290: variable 'ansible_shell_type' from source: unknown 30575 1726867566.52292: variable 'ansible_shell_executable' from source: unknown 30575 1726867566.52294: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867566.52296: variable 'ansible_pipelining' from source: unknown 30575 1726867566.52298: variable 'ansible_timeout' from source: unknown 30575 1726867566.52300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867566.52645: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867566.52650: variable 'omit' from source: magic vars 30575 1726867566.52652: starting attempt loop 30575 1726867566.52655: running the handler 30575 1726867566.52656: _low_level_execute_command(): starting 30575 1726867566.52659: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867566.53242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867566.53263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867566.53293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867566.53332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867566.53349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867566.53441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867566.53462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867566.53546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867566.55140: stdout chunk (state=3): >>>/root <<< 30575 1726867566.55299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867566.55302: stdout chunk (state=3): >>><<< 30575 1726867566.55304: stderr chunk (state=3): >>><<< 30575 1726867566.55321: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867566.55428: _low_level_execute_command(): starting 30575 1726867566.55432: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867566.5533974-30667-67107394169630 `" && echo ansible-tmp-1726867566.5533974-30667-67107394169630="` echo /root/.ansible/tmp/ansible-tmp-1726867566.5533974-30667-67107394169630 `" ) && sleep 0' 30575 1726867566.55943: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867566.55957: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867566.55975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867566.55998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867566.56016: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867566.56030: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867566.56044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867566.56063: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867566.56150: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867566.56175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867566.56252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867566.58135: stdout chunk (state=3): >>>ansible-tmp-1726867566.5533974-30667-67107394169630=/root/.ansible/tmp/ansible-tmp-1726867566.5533974-30667-67107394169630 <<< 30575 1726867566.58259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867566.58296: stderr chunk (state=3): >>><<< 30575 1726867566.58306: stdout chunk (state=3): >>><<< 30575 1726867566.58331: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867566.5533974-30667-67107394169630=/root/.ansible/tmp/ansible-tmp-1726867566.5533974-30667-67107394169630 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867566.58393: variable 'ansible_module_compression' from source: unknown 30575 1726867566.58460: ANSIBALLZ: Using lock for stat 30575 1726867566.58473: ANSIBALLZ: Acquiring lock 30575 1726867566.58485: ANSIBALLZ: Lock acquired: 140240646920128 30575 1726867566.58493: ANSIBALLZ: Creating module 30575 1726867566.71551: ANSIBALLZ: Writing module into payload 30575 1726867566.71683: ANSIBALLZ: Writing module 30575 1726867566.71687: ANSIBALLZ: Renaming module 30575 1726867566.71690: ANSIBALLZ: Done creating module 30575 1726867566.71710: variable 'ansible_facts' from source: unknown 30575 1726867566.71790: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867566.5533974-30667-67107394169630/AnsiballZ_stat.py 30575 1726867566.72021: Sending initial data 30575 1726867566.72030: Sent initial data (152 bytes) 30575 1726867566.72594: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867566.72653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867566.72670: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867566.72694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867566.72767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867566.74411: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867566.74462: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867566.74500: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp85_wa9x1 /root/.ansible/tmp/ansible-tmp-1726867566.5533974-30667-67107394169630/AnsiballZ_stat.py <<< 30575 1726867566.74539: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867566.5533974-30667-67107394169630/AnsiballZ_stat.py" <<< 30575 1726867566.74572: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp85_wa9x1" to remote "/root/.ansible/tmp/ansible-tmp-1726867566.5533974-30667-67107394169630/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867566.5533974-30667-67107394169630/AnsiballZ_stat.py" <<< 30575 1726867566.75372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867566.75531: stdout chunk (state=3): >>><<< 30575 1726867566.75535: stderr chunk (state=3): >>><<< 30575 1726867566.75541: done transferring module to remote 30575 1726867566.75544: _low_level_execute_command(): starting 30575 1726867566.75546: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867566.5533974-30667-67107394169630/ /root/.ansible/tmp/ansible-tmp-1726867566.5533974-30667-67107394169630/AnsiballZ_stat.py && sleep 0' 30575 1726867566.76324: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867566.76392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867566.76408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867566.76419: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867566.76491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867566.78327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867566.78339: stdout chunk (state=3): >>><<< 30575 1726867566.78351: stderr chunk (state=3): >>><<< 30575 1726867566.78371: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867566.78383: _low_level_execute_command(): starting 30575 1726867566.78393: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867566.5533974-30667-67107394169630/AnsiballZ_stat.py && sleep 0' 30575 1726867566.79070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867566.79088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867566.79104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867566.79244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867566.79248: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867566.79269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867566.79367: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30575 1726867566.81522: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 30575 1726867566.81547: stdout chunk (state=3): >>>import _imp # builtin <<< 30575 1726867566.81578: stdout chunk (state=3): >>>import '_thread' # <<< 30575 1726867566.81601: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 30575 1726867566.81649: stdout chunk (state=3): >>>import '_io' # <<< 30575 1726867566.81666: stdout chunk (state=3): >>>import 'marshal' # <<< 30575 1726867566.81697: stdout chunk (state=3): >>>import 'posix' # <<< 30575 1726867566.81734: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 30575 1726867566.81759: stdout chunk (state=3): >>># installing zipimport hook import 'time' # <<< 30575 1726867566.81768: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 30575 1726867566.81821: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.81855: stdout chunk (state=3): >>>import '_codecs' # <<< 30575 1726867566.81858: stdout chunk (state=3): >>>import 'codecs' # <<< 30575 1726867566.81902: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 30575 1726867566.81926: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 30575 1726867566.81963: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6ce84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6cb7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # 
code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6ceaa50> <<< 30575 1726867566.81997: stdout chunk (state=3): >>>import '_signal' # <<< 30575 1726867566.82021: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 30575 1726867566.82038: stdout chunk (state=3): >>>import 'io' # <<< 30575 1726867566.82073: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 30575 1726867566.82154: stdout chunk (state=3): >>>import '_collections_abc' # <<< 30575 1726867566.82195: stdout chunk (state=3): >>>import 'genericpath' # <<< 30575 1726867566.82209: stdout chunk (state=3): >>>import 'posixpath' # import 'os' # <<< 30575 1726867566.82242: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages <<< 30575 1726867566.82267: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 30575 1726867566.82314: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 30575 1726867566.82324: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6a99130> <<< 30575 1726867566.82397: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.82422: stdout chunk (state=3): >>>import '_distutils_hack' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6a99fa0> import 'site' # <<< 30575 1726867566.82453: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 30575 1726867566.82700: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 30575 1726867566.82708: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 30575 1726867566.82745: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 30575 1726867566.82797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 30575 1726867566.82800: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 30575 1726867566.82837: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6ad7e60> <<< 30575 1726867566.82875: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 30575 1726867566.82882: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 30575 1726867566.82922: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6ad7f20> <<< 30575 1726867566.82928: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 30575 1726867566.82952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 30575 1726867566.82975: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 30575 1726867566.83040: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.83043: stdout chunk (state=3): >>>import 'itertools' # <<< 30575 1726867566.83072: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b0f890> <<< 30575 1726867566.83112: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 30575 1726867566.83128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b0ff20> import '_collections' # <<< 30575 1726867566.83175: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6aefb30> import '_functools' # <<< 30575 1726867566.83212: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6aed250> <<< 30575 1726867566.83298: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6ad5010> <<< 30575 1726867566.83347: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py 
<<< 30575 1726867566.83350: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 30575 1726867566.83380: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 30575 1726867566.83412: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 30575 1726867566.83438: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 30575 1726867566.83467: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b2f800> <<< 30575 1726867566.83481: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b2e450> <<< 30575 1726867566.83531: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6aee120> <<< 30575 1726867566.83535: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b2ccb0> <<< 30575 1726867566.83585: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b64860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6ad4290> <<< 30575 1726867566.83618: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 30575 1726867566.83672: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6b64d10> <<< 30575 1726867566.83681: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b64bc0> <<< 30575 1726867566.83710: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6b64fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6ad2db0> <<< 30575 1726867566.83757: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 30575 1726867566.83773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 30575 1726867566.83816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 30575 1726867566.83846: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b656a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b65370> import 
'importlib.machinery' # <<< 30575 1726867566.83867: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 30575 1726867566.83901: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b665a0> <<< 30575 1726867566.83919: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 30575 1726867566.83942: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 30575 1726867566.83965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 30575 1726867566.84002: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 30575 1726867566.84031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b7c7a0> import 'errno' # <<< 30575 1726867566.84062: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.84090: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6b7de80> <<< 30575 1726867566.84103: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 30575 1726867566.84143: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches 
/usr/lib64/python3.12/_compression.py <<< 30575 1726867566.84156: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b7ed20> <<< 30575 1726867566.84192: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6b7f320> <<< 30575 1726867566.84219: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b7e270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 30575 1726867566.84234: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 30575 1726867566.84279: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.84293: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6b7fda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b7f4d0> <<< 30575 1726867566.84332: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b66510> <<< 30575 1726867566.84366: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 30575 1726867566.84379: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 30575 1726867566.84412: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 30575 1726867566.84427: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 30575 1726867566.84451: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e68fbbf0> <<< 30575 1726867566.84486: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 30575 1726867566.84516: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6924740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e69244a0> <<< 30575 1726867566.84540: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6924680> <<< 30575 1726867566.84574: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 30575 1726867566.84588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' 
<<< 30575 1726867566.84645: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.84776: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6924fe0> <<< 30575 1726867566.84889: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6925910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e69248c0> <<< 30575 1726867566.84919: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e68f9d90> <<< 30575 1726867566.84942: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 30575 1726867566.84972: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 30575 1726867566.84998: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 30575 1726867566.85015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6926d20> <<< 30575 1726867566.85058: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6925a60> <<< 30575 1726867566.85074: stdout chunk (state=3): >>>import 'tempfile' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b66750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 30575 1726867566.85141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.85153: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 30575 1726867566.85201: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 30575 1726867566.85212: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e694f080> <<< 30575 1726867566.85280: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 30575 1726867566.85307: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 30575 1726867566.85322: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 30575 1726867566.85363: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6973440> <<< 30575 1726867566.85389: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 30575 1726867566.85437: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 30575 1726867566.85483: stdout chunk (state=3): >>>import 'ntpath' # <<< 30575 1726867566.85515: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e69d4230> <<< 30575 1726867566.85549: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 30575 1726867566.85563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 30575 1726867566.85589: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 30575 1726867566.85629: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 30575 1726867566.85710: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e69d6990> <<< 30575 1726867566.85786: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e69d4350> <<< 30575 1726867566.85839: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e69a1250> <<< 30575 1726867566.85872: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6315310> <<< 30575 1726867566.85887: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6972240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6927c50> <<< 30575 1726867566.85986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 30575 
1726867566.86009: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb4e63155b0> <<< 30575 1726867566.86160: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_j1c_lb22/ansible_stat_payload.zip' <<< 30575 1726867566.86173: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.86280: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.86318: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 30575 1726867566.86334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 30575 1726867566.86366: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 30575 1726867566.86438: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 30575 1726867566.86481: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e636afc0> <<< 30575 1726867566.86484: stdout chunk (state=3): >>>import '_typing' # <<< 30575 1726867566.86673: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6349eb0> <<< 30575 1726867566.86676: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e63490a0> <<< 30575 1726867566.86696: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.86711: stdout chunk (state=3): >>>import 'ansible' # <<< 30575 1726867566.86743: stdout chunk (state=3): >>># zipimport: zlib available # 
zipimport: zlib available <<< 30575 1726867566.86773: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 30575 1726867566.86775: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.88172: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.89255: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e63692b0> <<< 30575 1726867566.89287: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.89328: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 30575 1726867566.89368: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 30575 1726867566.89371: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.89407: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6396960> <<< 30575 1726867566.89410: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e63966f0> <<< 30575 
1726867566.89444: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6396030> <<< 30575 1726867566.89472: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 30575 1726867566.89514: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6396480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e636bc50> <<< 30575 1726867566.89540: stdout chunk (state=3): >>>import 'atexit' # <<< 30575 1726867566.89562: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6397710> <<< 30575 1726867566.89595: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6397950> <<< 30575 1726867566.89606: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 30575 1726867566.89655: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 30575 1726867566.89658: stdout chunk (state=3): >>>import '_locale' # <<< 30575 1726867566.89705: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6397e90> import 'pwd' # <<< 30575 1726867566.89733: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 30575 1726867566.89754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 30575 1726867566.89815: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6201c70> <<< 30575 1726867566.89832: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6203890> <<< 30575 1726867566.89858: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 30575 1726867566.89869: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 30575 1726867566.89903: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6204260> <<< 30575 1726867566.89950: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 30575 1726867566.89954: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 30575 1726867566.89973: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6205400> <<< 30575 1726867566.89986: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 30575 1726867566.90026: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 30575 1726867566.90044: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 30575 1726867566.90093: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6207ec0> <<< 30575 1726867566.90138: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6ad7d10> <<< 30575 1726867566.90171: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6206180> <<< 30575 1726867566.90174: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 30575 1726867566.90193: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 30575 1726867566.90220: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 30575 1726867566.90244: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 30575 1726867566.90271: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 30575 1726867566.90298: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb4e620fe90> <<< 30575 1726867566.90322: stdout chunk (state=3): >>>import '_tokenize' # <<< 30575 1726867566.90401: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e620e990> <<< 30575 1726867566.90416: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e620e6f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 30575 1726867566.90495: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e620ec30> <<< 30575 1726867566.90518: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6206690> <<< 30575 1726867566.90545: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6257f50> <<< 30575 1726867566.90585: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6258260> <<< 30575 1726867566.90613: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 30575 
1726867566.90638: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 30575 1726867566.90698: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 30575 1726867566.90702: stdout chunk (state=3): >>> # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6259d30> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6259af0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 30575 1726867566.90805: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 30575 1726867566.90857: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.90876: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e625c2c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e625a420> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 30575 1726867566.90926: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.90966: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from 
'/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 30575 1726867566.90969: stdout chunk (state=3): >>>import '_string' # <<< 30575 1726867566.91006: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e625faa0> <<< 30575 1726867566.91126: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e625c470> <<< 30575 1726867566.91189: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e62607d0> <<< 30575 1726867566.91220: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6260a40> <<< 30575 1726867566.91274: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6260dd0> <<< 30575 1726867566.91301: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6258440> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # 
code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 30575 1726867566.91327: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 30575 1726867566.91346: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 30575 1726867566.91382: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.91409: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e60ec350> <<< 30575 1726867566.91554: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 30575 1726867566.91565: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e60ed460> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6262b10> <<< 30575 1726867566.91604: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6263e90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6262750> # zipimport: zlib available <<< 30575 1726867566.91645: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.compat' # <<< 30575 1726867566.91648: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.91740: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.91887: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.91890: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available <<< 30575 1726867566.91895: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available<<< 30575 1726867566.91905: stdout chunk (state=3): >>> <<< 30575 1726867566.92021: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.92142: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.92674: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.93214: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 30575 1726867566.93239: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 30575 1726867566.93262: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 30575 1726867566.93273: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.93314: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e60f5790> <<< 30575 1726867566.93404: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 30575 1726867566.93416: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e60f6ba0> <<< 30575 1726867566.93442: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e60ed6a0> <<< 30575 1726867566.93482: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 30575 1726867566.93518: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.93522: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 30575 1726867566.93551: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.93679: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.93848: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 30575 1726867566.93851: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e60f6cc0> <<< 30575 1726867566.93872: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.94318: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.94752: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.94827: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.94908: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 30575 1726867566.94955: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.94993: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 30575 
1726867566.95004: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.95063: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.95157: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 30575 1726867566.95161: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.95189: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 30575 1726867566.95237: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.95271: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 30575 1726867566.95289: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.95504: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.95739: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 30575 1726867566.95798: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 30575 1726867566.95809: stdout chunk (state=3): >>>import '_ast' # <<< 30575 1726867566.95878: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e60f7800> <<< 30575 1726867566.95893: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.95950: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.96035: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 30575 1726867566.96064: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.96106: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.96156: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 30575 1726867566.96199: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.96259: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.96299: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.96376: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 30575 1726867566.96402: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.96489: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6102360> <<< 30575 1726867566.96539: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e60fed50> <<< 30575 1726867566.96574: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 30575 1726867566.96579: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.96629: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.96688: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.96718: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.96769: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 30575 1726867566.96815: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 30575 1726867566.96818: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 30575 1726867566.96840: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 30575 1726867566.96892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 30575 1726867566.96920: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 30575 1726867566.97040: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e61eec90> <<< 30575 1726867566.97085: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e63ce960> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e61024e0> <<< 30575 1726867566.97302: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e60f85f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 30575 1726867566.97386: stdout chunk (state=3): >>># zipimport: zlib available <<< 30575 1726867566.97571: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 30575 1726867566.97695: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 30575 1726867566.98010: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path <<< 30575 1726867566.98050: stdout chunk (state=3): >>># restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack <<< 30575 1726867566.98078: stdout chunk (state=3): >>># destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] 
removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # 
cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ <<< 30575 1726867566.98140: stdout chunk (state=3): >>># cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy 
ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text <<< 30575 1726867566.98147: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text<<< 30575 1726867566.98175: stdout chunk (state=3): >>> # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy 
ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 30575 1726867566.98403: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 30575 1726867566.98455: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma <<< 30575 1726867566.98515: stdout chunk (state=3): >>># destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 30575 1726867566.98525: stdout chunk (state=3): >>># destroy ntpath <<< 30575 1726867566.98565: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # 
destroy select <<< 30575 1726867566.98628: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime <<< 30575 1726867566.98631: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 30575 1726867566.98641: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 30575 1726867566.98710: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves <<< 30575 1726867566.98738: stdout chunk (state=3): >>># cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 30575 1726867566.98781: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] 
wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 30575 1726867566.98819: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat <<< 30575 1726867566.98846: stdout chunk (state=3): >>># cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread <<< 30575 1726867566.98856: stdout chunk (state=3): >>># cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 30575 1726867566.98985: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 30575 1726867566.99028: stdout chunk (state=3): >>># destroy _collections <<< 30575 1726867566.99054: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 30575 1726867566.99101: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse <<< 30575 1726867566.99114: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 30575 1726867566.99228: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 30575 1726867566.99232: stdout chunk (state=3): >>># destroy time <<< 30575 1726867566.99299: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re <<< 30575 1726867566.99302: stdout chunk (state=3): >>># destroy itertools <<< 30575 1726867566.99321: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 30575 1726867566.99693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867566.99696: stdout chunk (state=3): >>><<< 30575 1726867566.99699: stderr chunk (state=3): >>><<< 30575 1726867566.99826: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6ce84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6cb7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6ceaa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6a99130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6a99fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6ad7e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6ad7f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b0f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b0ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6aefb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6aed250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6ad5010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b2f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b2e450> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6aee120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b2ccb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b64860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6ad4290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6b64d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b64bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6b64fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6ad2db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b656a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b65370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b665a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b7c7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6b7de80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b7ed20> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6b7f320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b7e270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6b7fda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b7f4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b66510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e68fbbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6924740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e69244a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6924680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6924fe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6925910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e69248c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e68f9d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6926d20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6925a60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6b66750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e694f080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6973440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e69d4230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e69d6990> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e69d4350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e69a1250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6315310> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6972240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6927c50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb4e63155b0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_j1c_lb22/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fb4e636afc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6349eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e63490a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e63692b0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6396960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e63966f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6396030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6396480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e636bc50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6397710> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6397950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6397e90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6201c70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6203890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6204260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6205400> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6207ec0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6ad7d10> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6206180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e620fe90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e620e990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e620e6f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e620ec30> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6206690> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6257f50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6258260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6259d30> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6259af0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e625c2c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e625a420> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e625faa0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e625c470> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e62607d0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6260a40> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6260dd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6258440> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e60ec350> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e60ed460> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6262b10> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6263e90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e6262750> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e60f5790> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e60f6ba0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e60ed6a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e60f6cc0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e60f7800> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4e6102360> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e60fed50> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e61eec90> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e63ce960> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e61024e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4e60f85f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy 
keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing 
_typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # ...
# destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] 
wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # 
destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 30575 1726867567.00532: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867566.5533974-30667-67107394169630/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867567.00535: _low_level_execute_command(): starting 30575 1726867567.00537: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867566.5533974-30667-67107394169630/ > /dev/null 2>&1 && sleep 0' 30575 1726867567.00893: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867567.00896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867567.00907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867567.00988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867567.00998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867567.01019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867567.01099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867567.03087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867567.03090: stdout chunk (state=3): >>><<< 30575 1726867567.03093: stderr chunk (state=3): >>><<< 30575 1726867567.03095: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867567.03098: handler run complete 30575 1726867567.03101: attempt loop complete, returning result 30575 1726867567.03103: _execute() done 30575 1726867567.03105: dumping result to json 30575 1726867567.03107: done dumping result, returning 30575 1726867567.03109: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [0affcac9-a3a5-e081-a588-00000000002e] 30575 1726867567.03111: sending task result for task 0affcac9-a3a5-e081-a588-00000000002e
ok: [managed_node3] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
30575 1726867567.03340: no more pending results, returning what we have 30575 1726867567.03344: results queue empty 30575 1726867567.03345: checking for any_errors_fatal 30575 1726867567.03352: done checking for any_errors_fatal 30575 1726867567.03353: checking for max_fail_percentage 30575 1726867567.03355: done checking for max_fail_percentage 30575 1726867567.03356: checking to see if all hosts have failed and the running result is not ok 30575 1726867567.03357: done checking to see if all hosts have failed 30575 1726867567.03357: getting the remaining hosts for this loop 30575 1726867567.03359: done getting the remaining hosts for this loop 30575 1726867567.03362: getting the next task for host managed_node3 30575 1726867567.03369: done getting next task for host managed_node3 30575 1726867567.03371: ^ task is: TASK: Set flag to indicate system is ostree 30575 1726867567.03374: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867567.03379: getting variables 30575 1726867567.03381: in VariableManager get_vars() 30575 1726867567.03413: Calling all_inventory to load vars for managed_node3 30575 1726867567.03416: Calling groups_inventory to load vars for managed_node3 30575 1726867567.03420: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867567.03434: Calling all_plugins_play to load vars for managed_node3 30575 1726867567.03437: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867567.03441: Calling groups_plugins_play to load vars for managed_node3 30575 1726867567.03806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867567.04148: done with get_vars() 30575 1726867567.04159: done getting variables 30575 1726867567.04194: done sending task result for task 0affcac9-a3a5-e081-a588-00000000002e 30575 1726867567.04198: WORKER PROCESS EXITING 30575 1726867567.04284: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
TASK [Set flag to indicate system is ostree] ***********************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22
Friday 20 September 2024 17:26:07 -0400 (0:00:00.542) 0:00:02.420 ******
30575 1726867567.04321: entering _queue_task() for managed_node3/set_fact 30575 1726867567.04325: Creating lock for set_fact 30575 1726867567.04619: worker is 1 (out of 1 available) 30575 1726867567.04633: exiting _queue_task() 
for managed_node3/set_fact 30575 1726867567.04759: done queuing things up, now waiting for results queue to drain 30575 1726867567.04761: waiting for pending results... 30575 1726867567.04912: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 30575 1726867567.05028: in run() - task 0affcac9-a3a5-e081-a588-00000000002f 30575 1726867567.05049: variable 'ansible_search_path' from source: unknown 30575 1726867567.05057: variable 'ansible_search_path' from source: unknown 30575 1726867567.05106: calling self._execute() 30575 1726867567.05179: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867567.05196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867567.05212: variable 'omit' from source: magic vars 30575 1726867567.05701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867567.06019: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867567.06079: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867567.06118: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867567.06158: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867567.06253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867567.06291: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867567.06325: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867567.06358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867567.06486: Evaluated conditional (not __network_is_ostree is defined): True 30575 1726867567.06505: variable 'omit' from source: magic vars 30575 1726867567.06548: variable 'omit' from source: magic vars 30575 1726867567.06672: variable '__ostree_booted_stat' from source: set_fact 30575 1726867567.06737: variable 'omit' from source: magic vars 30575 1726867567.06765: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867567.06799: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867567.06832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867567.06854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867567.06883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867567.06908: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867567.06939: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867567.06943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867567.07045: Set connection var ansible_pipelining to False 30575 1726867567.07084: Set connection var ansible_shell_type to sh 30575 1726867567.07087: Set connection var ansible_shell_executable to /bin/sh 30575 1726867567.07089: Set connection var ansible_timeout to 10 30575 1726867567.07091: Set connection var 
ansible_module_compression to ZIP_DEFLATED 30575 1726867567.07093: Set connection var ansible_connection to ssh 30575 1726867567.07111: variable 'ansible_shell_executable' from source: unknown 30575 1726867567.07117: variable 'ansible_connection' from source: unknown 30575 1726867567.07125: variable 'ansible_module_compression' from source: unknown 30575 1726867567.07131: variable 'ansible_shell_type' from source: unknown 30575 1726867567.07136: variable 'ansible_shell_executable' from source: unknown 30575 1726867567.07153: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867567.07156: variable 'ansible_pipelining' from source: unknown 30575 1726867567.07157: variable 'ansible_timeout' from source: unknown 30575 1726867567.07159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867567.07249: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867567.07299: variable 'omit' from source: magic vars 30575 1726867567.07302: starting attempt loop 30575 1726867567.07304: running the handler 30575 1726867567.07307: handler run complete 30575 1726867567.07308: attempt loop complete, returning result 30575 1726867567.07310: _execute() done 30575 1726867567.07312: dumping result to json 30575 1726867567.07316: done dumping result, returning 30575 1726867567.07330: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [0affcac9-a3a5-e081-a588-00000000002f] 30575 1726867567.07339: sending task result for task 0affcac9-a3a5-e081-a588-00000000002f 30575 1726867567.07608: done sending task result for task 0affcac9-a3a5-e081-a588-00000000002f 30575 1726867567.07611: WORKER PROCESS EXITING ok: [managed_node3] => { 
    "ansible_facts": {
        "__network_is_ostree": false
    },
    "changed": false
}
30575 1726867567.07653: no more pending results, returning what we have 30575 1726867567.07655: results queue empty 30575 1726867567.07656: checking for any_errors_fatal 30575 1726867567.07660: done checking for any_errors_fatal 30575 1726867567.07660: checking for max_fail_percentage 30575 1726867567.07661: done checking for max_fail_percentage 30575 1726867567.07662: checking to see if all hosts have failed and the running result is not ok 30575 1726867567.07663: done checking to see if all hosts have failed 30575 1726867567.07664: getting the remaining hosts for this loop 30575 1726867567.07665: done getting the remaining hosts for this loop 30575 1726867567.07667: getting the next task for host managed_node3 30575 1726867567.07675: done getting next task for host managed_node3 30575 1726867567.07679: ^ task is: TASK: Fix CentOS6 Base repo 30575 1726867567.07681: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867567.07684: getting variables 30575 1726867567.07685: in VariableManager get_vars() 30575 1726867567.07708: Calling all_inventory to load vars for managed_node3 30575 1726867567.07711: Calling groups_inventory to load vars for managed_node3 30575 1726867567.07714: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867567.07729: Calling all_plugins_play to load vars for managed_node3 30575 1726867567.07732: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867567.07740: Calling groups_plugins_play to load vars for managed_node3 30575 1726867567.08005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867567.08202: done with get_vars() 30575 1726867567.08211: done getting variables 30575 1726867567.08329: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
TASK [Fix CentOS6 Base repo] ***************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26
Friday 20 September 2024 17:26:07 -0400 (0:00:00.040) 0:00:02.461 ******
30575 1726867567.08355: entering _queue_task() for managed_node3/copy 30575 1726867567.08779: worker is 1 (out of 1 available) 30575 1726867567.08786: exiting _queue_task() for managed_node3/copy 30575 1726867567.08796: done queuing things up, now waiting for results queue to drain 30575 1726867567.08798: waiting for pending results... 
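The trace above walks through the ostree detection pattern: a `stat` of `/run/ostree-booted` run on the remote host, then a `set_fact` that turns the registered result into `__network_is_ostree`. A minimal sketch of that task pair, with variable names taken from the log output; the actual YAML in `el_repo_setup.yml` is not shown in this trace and may differ:

```yaml
# Sketch only -- reconstructed from the debug trace, not copied from
# el_repo_setup.yml. Variable and task names match the log output.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

On this host `stat.exists` came back false, which is why the result above records `"__network_is_ostree": false` and later conditionals short-circuit on it.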
30575 1726867567.08835: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 30575 1726867567.09028: in run() - task 0affcac9-a3a5-e081-a588-000000000031 30575 1726867567.09031: variable 'ansible_search_path' from source: unknown 30575 1726867567.09035: variable 'ansible_search_path' from source: unknown 30575 1726867567.09037: calling self._execute() 30575 1726867567.09072: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867567.09086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867567.09097: variable 'omit' from source: magic vars 30575 1726867567.09530: variable 'ansible_distribution' from source: facts 30575 1726867567.09555: Evaluated conditional (ansible_distribution == 'CentOS'): True 30575 1726867567.09686: variable 'ansible_distribution_major_version' from source: facts 30575 1726867567.09697: Evaluated conditional (ansible_distribution_major_version == '6'): False 30575 1726867567.09704: when evaluation is False, skipping this task 30575 1726867567.09710: _execute() done 30575 1726867567.09717: dumping result to json 30575 1726867567.09727: done dumping result, returning 30575 1726867567.09737: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [0affcac9-a3a5-e081-a588-000000000031] 30575 1726867567.09746: sending task result for task 0affcac9-a3a5-e081-a588-000000000031
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
30575 1726867567.09994: no more pending results, returning what we have 30575 1726867567.09997: results queue empty 30575 1726867567.09998: checking for any_errors_fatal 30575 1726867567.10003: done checking for any_errors_fatal 30575 1726867567.10003: checking for max_fail_percentage 30575 1726867567.10005: done checking for max_fail_percentage 30575 1726867567.10006: checking to see if all hosts have failed and the 
running result is not ok 30575 1726867567.10007: done checking to see if all hosts have failed 30575 1726867567.10007: getting the remaining hosts for this loop 30575 1726867567.10009: done getting the remaining hosts for this loop 30575 1726867567.10012: getting the next task for host managed_node3 30575 1726867567.10017: done getting next task for host managed_node3 30575 1726867567.10020: ^ task is: TASK: Include the task 'enable_epel.yml' 30575 1726867567.10026: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867567.10030: getting variables 30575 1726867567.10031: in VariableManager get_vars() 30575 1726867567.10060: Calling all_inventory to load vars for managed_node3 30575 1726867567.10062: Calling groups_inventory to load vars for managed_node3 30575 1726867567.10065: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867567.10076: Calling all_plugins_play to load vars for managed_node3 30575 1726867567.10080: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867567.10083: Calling groups_plugins_play to load vars for managed_node3 30575 1726867567.10327: done sending task result for task 0affcac9-a3a5-e081-a588-000000000031 30575 1726867567.10330: WORKER PROCESS EXITING 30575 1726867567.10350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867567.10551: done with get_vars() 30575 1726867567.10559: done getting variables
TASK [Include the task 'enable_epel.yml'] **************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Friday 20 September 2024 17:26:07 -0400 (0:00:00.022) 0:00:02.484 ******
30575 1726867567.10645: entering _queue_task() for managed_node3/include_tasks 30575 1726867567.10850: worker is 1 (out of 1 available) 30575 1726867567.10860: exiting _queue_task() for managed_node3/include_tasks 30575 1726867567.10870: done queuing things up, now waiting for results queue to drain 30575 1726867567.10872: waiting for pending results... 
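The skipped task and the include that follows illustrate how the trace evaluates `when` clauses one at a time: `ansible_distribution == 'CentOS'` is True, `ansible_distribution_major_version == '6'` is False, so the copy task is skipped with that `false_condition`; the include of `enable_epel.yml` then proceeds because `not __network_is_ostree | d(false)` is True. A hedged sketch of the shape of those two tasks, assuming the conditionals from the log; the copy destination and content variable are hypothetical, not taken from the trace:

```yaml
# Sketch only -- conditionals come from the debug trace; the copy module's
# dest and content are assumed placeholders, not from el_repo_setup.yml.
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # assumed destination
    content: "{{ centos6_repo_content }}"     # hypothetical variable
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'

- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)
```

Because the second condition in the list fails first for this CentOS host, Ansible reports only `ansible_distribution_major_version == '6'` as the `false_condition` in the skip result.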
30575 1726867567.11096: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 30575 1726867567.11193: in run() - task 0affcac9-a3a5-e081-a588-000000000032 30575 1726867567.11208: variable 'ansible_search_path' from source: unknown 30575 1726867567.11214: variable 'ansible_search_path' from source: unknown 30575 1726867567.11258: calling self._execute() 30575 1726867567.11322: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867567.11336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867567.11462: variable 'omit' from source: magic vars 30575 1726867567.11867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867567.14169: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867567.14255: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867567.14301: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867567.14346: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867567.14375: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867567.14463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867567.14500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867567.14541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867567.14632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867567.14636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867567.14718: variable '__network_is_ostree' from source: set_fact 30575 1726867567.14749: Evaluated conditional (not __network_is_ostree | d(false)): True 30575 1726867567.14759: _execute() done 30575 1726867567.14767: dumping result to json 30575 1726867567.14774: done dumping result, returning 30575 1726867567.14785: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [0affcac9-a3a5-e081-a588-000000000032] 30575 1726867567.14794: sending task result for task 0affcac9-a3a5-e081-a588-000000000032 30575 1726867567.14911: done sending task result for task 0affcac9-a3a5-e081-a588-000000000032 30575 1726867567.14914: WORKER PROCESS EXITING 30575 1726867567.14974: no more pending results, returning what we have 30575 1726867567.14981: in VariableManager get_vars() 30575 1726867567.15014: Calling all_inventory to load vars for managed_node3 30575 1726867567.15017: Calling groups_inventory to load vars for managed_node3 30575 1726867567.15020: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867567.15033: Calling all_plugins_play to load vars for managed_node3 30575 1726867567.15037: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867567.15040: Calling groups_plugins_play to load vars for managed_node3 30575 1726867567.15442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 30575 1726867567.15640: done with get_vars() 30575 1726867567.15647: variable 'ansible_search_path' from source: unknown 30575 1726867567.15648: variable 'ansible_search_path' from source: unknown 30575 1726867567.15686: we have included files to process 30575 1726867567.15687: generating all_blocks data 30575 1726867567.15689: done generating all_blocks data 30575 1726867567.15693: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 30575 1726867567.15694: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 30575 1726867567.15697: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 30575 1726867567.16366: done processing included file 30575 1726867567.16368: iterating over new_blocks loaded from include file 30575 1726867567.16369: in VariableManager get_vars() 30575 1726867567.16384: done with get_vars() 30575 1726867567.16386: filtering new block on tags 30575 1726867567.16405: done filtering new block on tags 30575 1726867567.16407: in VariableManager get_vars() 30575 1726867567.16415: done with get_vars() 30575 1726867567.16416: filtering new block on tags 30575 1726867567.16428: done filtering new block on tags 30575 1726867567.16430: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 30575 1726867567.16435: extending task lists for all hosts with included blocks 30575 1726867567.16544: done extending task lists 30575 1726867567.16546: done processing included files 30575 1726867567.16547: results queue empty 30575 1726867567.16547: checking for any_errors_fatal 30575 1726867567.16550: done checking for any_errors_fatal 30575 1726867567.16550: checking for max_fail_percentage 30575 1726867567.16551: done 
checking for max_fail_percentage 30575 1726867567.16552: checking to see if all hosts have failed and the running result is not ok 30575 1726867567.16553: done checking to see if all hosts have failed 30575 1726867567.16553: getting the remaining hosts for this loop 30575 1726867567.16554: done getting the remaining hosts for this loop 30575 1726867567.16557: getting the next task for host managed_node3 30575 1726867567.16561: done getting next task for host managed_node3 30575 1726867567.16563: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 30575 1726867567.16565: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867567.16567: getting variables 30575 1726867567.16568: in VariableManager get_vars() 30575 1726867567.16576: Calling all_inventory to load vars for managed_node3 30575 1726867567.16580: Calling groups_inventory to load vars for managed_node3 30575 1726867567.16583: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867567.16591: Calling all_plugins_play to load vars for managed_node3 30575 1726867567.16599: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867567.16602: Calling groups_plugins_play to load vars for managed_node3 30575 1726867567.16893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867567.17091: done with get_vars() 30575 1726867567.17099: done getting variables 30575 1726867567.17168: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 30575 1726867567.17374: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 17:26:07 -0400 (0:00:00.067) 0:00:02.551 ****** 30575 1726867567.17417: entering _queue_task() for managed_node3/command 30575 1726867567.17419: Creating lock for command 30575 1726867567.17888: worker is 1 (out of 1 available) 30575 1726867567.17895: exiting _queue_task() for managed_node3/command 30575 1726867567.17905: done queuing things up, now waiting for results queue to drain 30575 1726867567.17906: waiting for pending results... 
30575 1726867567.18037: running TaskExecutor() for managed_node3/TASK: Create EPEL 10 30575 1726867567.18058: in run() - task 0affcac9-a3a5-e081-a588-00000000004c 30575 1726867567.18076: variable 'ansible_search_path' from source: unknown 30575 1726867567.18087: variable 'ansible_search_path' from source: unknown 30575 1726867567.18134: calling self._execute() 30575 1726867567.18242: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867567.18246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867567.18248: variable 'omit' from source: magic vars 30575 1726867567.18883: variable 'ansible_distribution' from source: facts 30575 1726867567.18888: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30575 1726867567.18967: variable 'ansible_distribution_major_version' from source: facts 30575 1726867567.19014: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 30575 1726867567.19025: when evaluation is False, skipping this task 30575 1726867567.19227: _execute() done 30575 1726867567.19231: dumping result to json 30575 1726867567.19233: done dumping result, returning 30575 1726867567.19236: done running TaskExecutor() for managed_node3/TASK: Create EPEL 10 [0affcac9-a3a5-e081-a588-00000000004c] 30575 1726867567.19238: sending task result for task 0affcac9-a3a5-e081-a588-00000000004c 30575 1726867567.19306: done sending task result for task 0affcac9-a3a5-e081-a588-00000000004c 30575 1726867567.19309: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 30575 1726867567.19382: no more pending results, returning what we have 30575 1726867567.19386: results queue empty 30575 1726867567.19387: checking for any_errors_fatal 30575 1726867567.19388: done checking for any_errors_fatal 30575 1726867567.19388: checking for 
max_fail_percentage 30575 1726867567.19390: done checking for max_fail_percentage 30575 1726867567.19391: checking to see if all hosts have failed and the running result is not ok 30575 1726867567.19392: done checking to see if all hosts have failed 30575 1726867567.19392: getting the remaining hosts for this loop 30575 1726867567.19394: done getting the remaining hosts for this loop 30575 1726867567.19398: getting the next task for host managed_node3 30575 1726867567.19405: done getting next task for host managed_node3 30575 1726867567.19407: ^ task is: TASK: Install yum-utils package 30575 1726867567.19411: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867567.19416: getting variables 30575 1726867567.19418: in VariableManager get_vars() 30575 1726867567.19495: Calling all_inventory to load vars for managed_node3 30575 1726867567.19497: Calling groups_inventory to load vars for managed_node3 30575 1726867567.19501: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867567.19511: Calling all_plugins_play to load vars for managed_node3 30575 1726867567.19513: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867567.19515: Calling groups_plugins_play to load vars for managed_node3 30575 1726867567.20043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867567.20458: done with get_vars() 30575 1726867567.20467: done getting variables 30575 1726867567.20633: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 17:26:07 -0400 (0:00:00.032) 0:00:02.584 ****** 30575 1726867567.20773: entering _queue_task() for managed_node3/package 30575 1726867567.20775: Creating lock for package 30575 1726867567.21197: worker is 1 (out of 1 available) 30575 1726867567.21214: exiting _queue_task() for managed_node3/package 30575 1726867567.21228: done queuing things up, now waiting for results queue to drain 30575 1726867567.21230: waiting for pending results... 
30575 1726867567.21551: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 30575 1726867567.21555: in run() - task 0affcac9-a3a5-e081-a588-00000000004d 30575 1726867567.21557: variable 'ansible_search_path' from source: unknown 30575 1726867567.21560: variable 'ansible_search_path' from source: unknown 30575 1726867567.21581: calling self._execute() 30575 1726867567.21656: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867567.21670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867567.21690: variable 'omit' from source: magic vars 30575 1726867567.22084: variable 'ansible_distribution' from source: facts 30575 1726867567.22088: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30575 1726867567.22225: variable 'ansible_distribution_major_version' from source: facts 30575 1726867567.22236: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 30575 1726867567.22302: when evaluation is False, skipping this task 30575 1726867567.22306: _execute() done 30575 1726867567.22308: dumping result to json 30575 1726867567.22311: done dumping result, returning 30575 1726867567.22314: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [0affcac9-a3a5-e081-a588-00000000004d] 30575 1726867567.22316: sending task result for task 0affcac9-a3a5-e081-a588-00000000004d 30575 1726867567.22385: done sending task result for task 0affcac9-a3a5-e081-a588-00000000004d 30575 1726867567.22388: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 30575 1726867567.22437: no more pending results, returning what we have 30575 1726867567.22441: results queue empty 30575 1726867567.22441: checking for any_errors_fatal 30575 1726867567.22447: done checking for any_errors_fatal 30575 
1726867567.22448: checking for max_fail_percentage 30575 1726867567.22449: done checking for max_fail_percentage 30575 1726867567.22450: checking to see if all hosts have failed and the running result is not ok 30575 1726867567.22451: done checking to see if all hosts have failed 30575 1726867567.22451: getting the remaining hosts for this loop 30575 1726867567.22453: done getting the remaining hosts for this loop 30575 1726867567.22456: getting the next task for host managed_node3 30575 1726867567.22462: done getting next task for host managed_node3 30575 1726867567.22465: ^ task is: TASK: Enable EPEL 7 30575 1726867567.22469: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867567.22473: getting variables 30575 1726867567.22475: in VariableManager get_vars() 30575 1726867567.22503: Calling all_inventory to load vars for managed_node3 30575 1726867567.22506: Calling groups_inventory to load vars for managed_node3 30575 1726867567.22510: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867567.22522: Calling all_plugins_play to load vars for managed_node3 30575 1726867567.22527: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867567.22531: Calling groups_plugins_play to load vars for managed_node3 30575 1726867567.22888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867567.23097: done with get_vars() 30575 1726867567.23106: done getting variables 30575 1726867567.23168: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 17:26:07 -0400 (0:00:00.025) 0:00:02.609 ****** 30575 1726867567.23196: entering _queue_task() for managed_node3/command 30575 1726867567.23416: worker is 1 (out of 1 available) 30575 1726867567.23429: exiting _queue_task() for managed_node3/command 30575 1726867567.23440: done queuing things up, now waiting for results queue to drain 30575 1726867567.23442: waiting for pending results... 
30575 1726867567.23791: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 30575 1726867567.23796: in run() - task 0affcac9-a3a5-e081-a588-00000000004e 30575 1726867567.23804: variable 'ansible_search_path' from source: unknown 30575 1726867567.23813: variable 'ansible_search_path' from source: unknown 30575 1726867567.23854: calling self._execute() 30575 1726867567.23995: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867567.23999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867567.24001: variable 'omit' from source: magic vars 30575 1726867567.24550: variable 'ansible_distribution' from source: facts 30575 1726867567.24566: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30575 1726867567.24703: variable 'ansible_distribution_major_version' from source: facts 30575 1726867567.24715: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 30575 1726867567.24726: when evaluation is False, skipping this task 30575 1726867567.24735: _execute() done 30575 1726867567.24743: dumping result to json 30575 1726867567.24758: done dumping result, returning 30575 1726867567.24775: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [0affcac9-a3a5-e081-a588-00000000004e] 30575 1726867567.24875: sending task result for task 0affcac9-a3a5-e081-a588-00000000004e 30575 1726867567.24942: done sending task result for task 0affcac9-a3a5-e081-a588-00000000004e 30575 1726867567.24945: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 30575 1726867567.25027: no more pending results, returning what we have 30575 1726867567.25031: results queue empty 30575 1726867567.25032: checking for any_errors_fatal 30575 1726867567.25036: done checking for any_errors_fatal 30575 1726867567.25037: checking for 
max_fail_percentage 30575 1726867567.25039: done checking for max_fail_percentage 30575 1726867567.25040: checking to see if all hosts have failed and the running result is not ok 30575 1726867567.25041: done checking to see if all hosts have failed 30575 1726867567.25041: getting the remaining hosts for this loop 30575 1726867567.25043: done getting the remaining hosts for this loop 30575 1726867567.25047: getting the next task for host managed_node3 30575 1726867567.25055: done getting next task for host managed_node3 30575 1726867567.25058: ^ task is: TASK: Enable EPEL 8 30575 1726867567.25061: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867567.25065: getting variables 30575 1726867567.25067: in VariableManager get_vars() 30575 1726867567.25102: Calling all_inventory to load vars for managed_node3 30575 1726867567.25105: Calling groups_inventory to load vars for managed_node3 30575 1726867567.25109: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867567.25121: Calling all_plugins_play to load vars for managed_node3 30575 1726867567.25127: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867567.25130: Calling groups_plugins_play to load vars for managed_node3 30575 1726867567.25458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867567.25660: done with get_vars() 30575 1726867567.25670: done getting variables 30575 1726867567.25720: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 17:26:07 -0400 (0:00:00.025) 0:00:02.635 ****** 30575 1726867567.25750: entering _queue_task() for managed_node3/command 30575 1726867567.26184: worker is 1 (out of 1 available) 30575 1726867567.26292: exiting _queue_task() for managed_node3/command 30575 1726867567.26413: done queuing things up, now waiting for results queue to drain 30575 1726867567.26415: waiting for pending results... 
30575 1726867567.26667: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 30575 1726867567.26719: in run() - task 0affcac9-a3a5-e081-a588-00000000004f 30575 1726867567.26746: variable 'ansible_search_path' from source: unknown 30575 1726867567.26755: variable 'ansible_search_path' from source: unknown 30575 1726867567.26802: calling self._execute() 30575 1726867567.26882: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867567.26894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867567.26907: variable 'omit' from source: magic vars 30575 1726867567.27297: variable 'ansible_distribution' from source: facts 30575 1726867567.27320: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30575 1726867567.27459: variable 'ansible_distribution_major_version' from source: facts 30575 1726867567.27470: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 30575 1726867567.27510: when evaluation is False, skipping this task 30575 1726867567.27514: _execute() done 30575 1726867567.27516: dumping result to json 30575 1726867567.27521: done dumping result, returning 30575 1726867567.27530: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [0affcac9-a3a5-e081-a588-00000000004f] 30575 1726867567.27532: sending task result for task 0affcac9-a3a5-e081-a588-00000000004f skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 30575 1726867567.27737: no more pending results, returning what we have 30575 1726867567.27740: results queue empty 30575 1726867567.27741: checking for any_errors_fatal 30575 1726867567.27746: done checking for any_errors_fatal 30575 1726867567.27747: checking for max_fail_percentage 30575 1726867567.27748: done checking for max_fail_percentage 30575 1726867567.27749: checking to see if all hosts have failed and 
the running result is not ok 30575 1726867567.27750: done checking to see if all hosts have failed 30575 1726867567.27751: getting the remaining hosts for this loop 30575 1726867567.27752: done getting the remaining hosts for this loop 30575 1726867567.27756: getting the next task for host managed_node3 30575 1726867567.27765: done getting next task for host managed_node3 30575 1726867567.27767: ^ task is: TASK: Enable EPEL 6 30575 1726867567.27771: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867567.27776: getting variables 30575 1726867567.27780: in VariableManager get_vars() 30575 1726867567.27808: Calling all_inventory to load vars for managed_node3 30575 1726867567.27810: Calling groups_inventory to load vars for managed_node3 30575 1726867567.27814: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867567.27829: Calling all_plugins_play to load vars for managed_node3 30575 1726867567.27834: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867567.27838: Calling groups_plugins_play to load vars for managed_node3 30575 1726867567.28215: done sending task result for task 0affcac9-a3a5-e081-a588-00000000004f 30575 1726867567.28218: WORKER PROCESS EXITING 30575 1726867567.28244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867567.28442: done with get_vars() 30575 1726867567.28451: done getting variables 30575 1726867567.28508: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 17:26:07 -0400 (0:00:00.027) 0:00:02.662 ****** 30575 1726867567.28542: entering _queue_task() for managed_node3/copy 30575 1726867567.28991: worker is 1 (out of 1 available) 30575 1726867567.28999: exiting _queue_task() for managed_node3/copy 30575 1726867567.29008: done queuing things up, now waiting for results queue to drain 30575 1726867567.29009: waiting for pending results... 
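[Editor's note: the repeated `skipping: [managed_node3]` results in this log come from conditionally guarded tasks in `enable_epel.yml`. Based only on the conditionals the log shows being evaluated (`ansible_distribution in ['RedHat', 'CentOS']` → True, `ansible_distribution_major_version in ['7', '8']` → False) and the action plugins it loads (`command`, `package`, `copy`), the guard pattern looks roughly like the following sketch. The task bodies are not visible in the log, so the command itself is a placeholder; the real file at `tests/network/tasks/enable_epel.yml` may differ.]

```yaml
# Hypothetical reconstruction of the guard pattern behind the skip
# results above; only the `when:` conditions are taken from the log.
- name: Enable EPEL 7
  command: "{{ __epel_enable_command }}"   # placeholder; actual command not shown in the log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

On this host the first condition evaluates True but the second evaluates False (the distribution major version is 10), so Ansible short-circuits and reports `"false_condition": "ansible_distribution_major_version in ['7', '8']"` with `skip_reason: Conditional result was False`, exactly as each skip result above records.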
30575 1726867567.29035: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 30575 1726867567.29141: in run() - task 0affcac9-a3a5-e081-a588-000000000051 30575 1726867567.29236: variable 'ansible_search_path' from source: unknown 30575 1726867567.29240: variable 'ansible_search_path' from source: unknown 30575 1726867567.29243: calling self._execute() 30575 1726867567.29273: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867567.29286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867567.29301: variable 'omit' from source: magic vars 30575 1726867567.29737: variable 'ansible_distribution' from source: facts 30575 1726867567.29755: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30575 1726867567.29872: variable 'ansible_distribution_major_version' from source: facts 30575 1726867567.29890: Evaluated conditional (ansible_distribution_major_version == '6'): False 30575 1726867567.29897: when evaluation is False, skipping this task 30575 1726867567.29903: _execute() done 30575 1726867567.29909: dumping result to json 30575 1726867567.29915: done dumping result, returning 30575 1726867567.29926: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [0affcac9-a3a5-e081-a588-000000000051] 30575 1726867567.29935: sending task result for task 0affcac9-a3a5-e081-a588-000000000051 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 30575 1726867567.30176: no more pending results, returning what we have 30575 1726867567.30180: results queue empty 30575 1726867567.30181: checking for any_errors_fatal 30575 1726867567.30184: done checking for any_errors_fatal 30575 1726867567.30185: checking for max_fail_percentage 30575 1726867567.30186: done checking for max_fail_percentage 30575 1726867567.30187: checking to see if all hosts have failed and the running 
result is not ok 30575 1726867567.30188: done checking to see if all hosts have failed 30575 1726867567.30189: getting the remaining hosts for this loop 30575 1726867567.30190: done getting the remaining hosts for this loop 30575 1726867567.30193: getting the next task for host managed_node3 30575 1726867567.30200: done getting next task for host managed_node3 30575 1726867567.30202: ^ task is: TASK: Set network provider to 'nm' 30575 1726867567.30206: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867567.30209: getting variables 30575 1726867567.30210: in VariableManager get_vars() 30575 1726867567.30238: Calling all_inventory to load vars for managed_node3 30575 1726867567.30240: Calling groups_inventory to load vars for managed_node3 30575 1726867567.30243: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867567.30253: Calling all_plugins_play to load vars for managed_node3 30575 1726867567.30256: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867567.30258: Calling groups_plugins_play to load vars for managed_node3 30575 1726867567.30580: done sending task result for task 0affcac9-a3a5-e081-a588-000000000051 30575 1726867567.30584: WORKER PROCESS EXITING 30575 1726867567.30609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867567.30799: done with get_vars() 30575 1726867567.30807: done getting variables 30575 1726867567.30864: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:13 Friday 20 September 2024 17:26:07 -0400 (0:00:00.023) 0:00:02.686 ****** 30575 1726867567.30891: entering _queue_task() for managed_node3/set_fact 30575 1726867567.31109: worker is 1 (out of 1 available) 30575 1726867567.31120: exiting _queue_task() for managed_node3/set_fact 30575 1726867567.31134: done queuing things up, now waiting for results queue to drain 30575 1726867567.31249: waiting for pending results... 30575 1726867567.31384: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 30575 1726867567.31459: in run() - task 0affcac9-a3a5-e081-a588-000000000007 30575 1726867567.31486: variable 'ansible_search_path' from source: unknown 30575 1726867567.31521: calling self._execute() 30575 1726867567.31601: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867567.31612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867567.31629: variable 'omit' from source: magic vars 30575 1726867567.31737: variable 'omit' from source: magic vars 30575 1726867567.31772: variable 'omit' from source: magic vars 30575 1726867567.31820: variable 'omit' from source: magic vars 30575 1726867567.31916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867567.31931: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867567.31958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867567.31982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867567.32000: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867567.32044: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867567.32054: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867567.32061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867567.32172: Set connection var ansible_pipelining to False 30575 1726867567.32185: Set connection var ansible_shell_type to sh 30575 1726867567.32231: Set connection var ansible_shell_executable to /bin/sh 30575 1726867567.32235: Set connection var ansible_timeout to 10 30575 1726867567.32237: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867567.32239: Set connection var ansible_connection to ssh 30575 1726867567.32257: variable 'ansible_shell_executable' from source: unknown 30575 1726867567.32264: variable 'ansible_connection' from source: unknown 30575 1726867567.32270: variable 'ansible_module_compression' from source: unknown 30575 1726867567.32276: variable 'ansible_shell_type' from source: unknown 30575 1726867567.32341: variable 'ansible_shell_executable' from source: unknown 30575 1726867567.32344: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867567.32347: variable 'ansible_pipelining' from source: unknown 30575 1726867567.32349: variable 'ansible_timeout' from source: unknown 30575 1726867567.32351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867567.32459: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867567.32475: variable 'omit' from source: magic vars 30575 1726867567.32487: starting 
attempt loop 30575 1726867567.32493: running the handler 30575 1726867567.32507: handler run complete 30575 1726867567.32521: attempt loop complete, returning result 30575 1726867567.32531: _execute() done 30575 1726867567.32537: dumping result to json 30575 1726867567.32558: done dumping result, returning 30575 1726867567.32562: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [0affcac9-a3a5-e081-a588-000000000007] 30575 1726867567.32667: sending task result for task 0affcac9-a3a5-e081-a588-000000000007 30575 1726867567.32730: done sending task result for task 0affcac9-a3a5-e081-a588-000000000007 30575 1726867567.32733: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 30575 1726867567.32819: no more pending results, returning what we have 30575 1726867567.32825: results queue empty 30575 1726867567.32826: checking for any_errors_fatal 30575 1726867567.32831: done checking for any_errors_fatal 30575 1726867567.32831: checking for max_fail_percentage 30575 1726867567.32833: done checking for max_fail_percentage 30575 1726867567.32834: checking to see if all hosts have failed and the running result is not ok 30575 1726867567.32835: done checking to see if all hosts have failed 30575 1726867567.32836: getting the remaining hosts for this loop 30575 1726867567.32837: done getting the remaining hosts for this loop 30575 1726867567.32842: getting the next task for host managed_node3 30575 1726867567.32849: done getting next task for host managed_node3 30575 1726867567.32851: ^ task is: TASK: meta (flush_handlers) 30575 1726867567.32853: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867567.32857: getting variables 30575 1726867567.32859: in VariableManager get_vars() 30575 1726867567.32889: Calling all_inventory to load vars for managed_node3 30575 1726867567.32891: Calling groups_inventory to load vars for managed_node3 30575 1726867567.32895: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867567.32904: Calling all_plugins_play to load vars for managed_node3 30575 1726867567.32907: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867567.32910: Calling groups_plugins_play to load vars for managed_node3 30575 1726867567.33193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867567.33500: done with get_vars() 30575 1726867567.33509: done getting variables 30575 1726867567.33574: in VariableManager get_vars() 30575 1726867567.33586: Calling all_inventory to load vars for managed_node3 30575 1726867567.33589: Calling groups_inventory to load vars for managed_node3 30575 1726867567.33591: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867567.33595: Calling all_plugins_play to load vars for managed_node3 30575 1726867567.33597: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867567.33600: Calling groups_plugins_play to load vars for managed_node3 30575 1726867567.33746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867567.33934: done with get_vars() 30575 1726867567.33946: done queuing things up, now waiting for results queue to drain 30575 1726867567.33948: results queue empty 30575 1726867567.33949: checking for any_errors_fatal 30575 1726867567.33951: done checking for any_errors_fatal 30575 1726867567.33952: checking for max_fail_percentage 30575 1726867567.33953: done checking for max_fail_percentage 30575 1726867567.33953: checking to see if all hosts have failed and the running result is not 
ok 30575 1726867567.33954: done checking to see if all hosts have failed 30575 1726867567.33955: getting the remaining hosts for this loop 30575 1726867567.33956: done getting the remaining hosts for this loop 30575 1726867567.33962: getting the next task for host managed_node3 30575 1726867567.33965: done getting next task for host managed_node3 30575 1726867567.33967: ^ task is: TASK: meta (flush_handlers) 30575 1726867567.33968: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867567.33975: getting variables 30575 1726867567.33976: in VariableManager get_vars() 30575 1726867567.33986: Calling all_inventory to load vars for managed_node3 30575 1726867567.33988: Calling groups_inventory to load vars for managed_node3 30575 1726867567.33990: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867567.33994: Calling all_plugins_play to load vars for managed_node3 30575 1726867567.33996: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867567.33999: Calling groups_plugins_play to load vars for managed_node3 30575 1726867567.34139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867567.34349: done with get_vars() 30575 1726867567.34356: done getting variables 30575 1726867567.34406: in VariableManager get_vars() 30575 1726867567.34414: Calling all_inventory to load vars for managed_node3 30575 1726867567.34416: Calling groups_inventory to load vars for managed_node3 30575 1726867567.34418: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867567.34422: Calling all_plugins_play to load vars for managed_node3 30575 1726867567.34428: Calling groups_plugins_inventory to load vars for 
managed_node3 30575 1726867567.34431: Calling groups_plugins_play to load vars for managed_node3 30575 1726867567.34565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867567.34753: done with get_vars() 30575 1726867567.34764: done queuing things up, now waiting for results queue to drain 30575 1726867567.34765: results queue empty 30575 1726867567.34766: checking for any_errors_fatal 30575 1726867567.34767: done checking for any_errors_fatal 30575 1726867567.34768: checking for max_fail_percentage 30575 1726867567.34769: done checking for max_fail_percentage 30575 1726867567.34769: checking to see if all hosts have failed and the running result is not ok 30575 1726867567.34770: done checking to see if all hosts have failed 30575 1726867567.34771: getting the remaining hosts for this loop 30575 1726867567.34772: done getting the remaining hosts for this loop 30575 1726867567.34774: getting the next task for host managed_node3 30575 1726867567.34776: done getting next task for host managed_node3 30575 1726867567.34779: ^ task is: None 30575 1726867567.34780: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867567.34781: done queuing things up, now waiting for results queue to drain 30575 1726867567.34782: results queue empty 30575 1726867567.34783: checking for any_errors_fatal 30575 1726867567.34784: done checking for any_errors_fatal 30575 1726867567.34784: checking for max_fail_percentage 30575 1726867567.34785: done checking for max_fail_percentage 30575 1726867567.34786: checking to see if all hosts have failed and the running result is not ok 30575 1726867567.34786: done checking to see if all hosts have failed 30575 1726867567.34788: getting the next task for host managed_node3 30575 1726867567.34790: done getting next task for host managed_node3 30575 1726867567.34791: ^ task is: None 30575 1726867567.34792: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867567.34843: in VariableManager get_vars() 30575 1726867567.34858: done with get_vars() 30575 1726867567.34864: in VariableManager get_vars() 30575 1726867567.34874: done with get_vars() 30575 1726867567.34881: variable 'omit' from source: magic vars 30575 1726867567.34912: in VariableManager get_vars() 30575 1726867567.34921: done with get_vars() 30575 1726867567.34951: variable 'omit' from source: magic vars PLAY [Play for testing states] ************************************************* 30575 1726867567.35560: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 30575 1726867567.35671: getting the remaining hosts for this loop 30575 1726867567.35673: done getting the remaining hosts for this loop 30575 1726867567.35675: getting the next task for host managed_node3 30575 1726867567.35680: done getting next task for host managed_node3 30575 1726867567.35682: ^ task is: TASK: Gathering Facts 30575 1726867567.35683: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867567.35685: getting variables 30575 1726867567.35686: in VariableManager get_vars() 30575 1726867567.35693: Calling all_inventory to load vars for managed_node3 30575 1726867567.35809: Calling groups_inventory to load vars for managed_node3 30575 1726867567.35812: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867567.35817: Calling all_plugins_play to load vars for managed_node3 30575 1726867567.35833: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867567.35837: Calling groups_plugins_play to load vars for managed_node3 30575 1726867567.36046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867567.36342: done with get_vars() 30575 1726867567.36490: done getting variables 30575 1726867567.36530: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:3 Friday 20 September 2024 17:26:07 -0400 (0:00:00.056) 0:00:02.743 ****** 30575 1726867567.36552: entering _queue_task() for managed_node3/gather_facts 30575 1726867567.36953: worker is 1 (out of 1 available) 30575 1726867567.36965: exiting _queue_task() for managed_node3/gather_facts 30575 1726867567.36976: done queuing things up, now waiting for results queue to drain 30575 1726867567.37197: waiting for pending results... 
30575 1726867567.37594: running TaskExecutor() for managed_node3/TASK: Gathering Facts 30575 1726867567.37597: in run() - task 0affcac9-a3a5-e081-a588-000000000077 30575 1726867567.37601: variable 'ansible_search_path' from source: unknown 30575 1726867567.37636: calling self._execute() 30575 1726867567.37936: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867567.37940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867567.37942: variable 'omit' from source: magic vars 30575 1726867567.38414: variable 'ansible_distribution_major_version' from source: facts 30575 1726867567.38609: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867567.38620: variable 'omit' from source: magic vars 30575 1726867567.38649: variable 'omit' from source: magic vars 30575 1726867567.38789: variable 'omit' from source: magic vars 30575 1726867567.38836: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867567.38957: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867567.38982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867567.39003: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867567.39045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867567.39076: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867567.39145: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867567.39153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867567.39365: Set connection var ansible_pipelining to False 30575 1726867567.39373: Set 
connection var ansible_shell_type to sh 30575 1726867567.39385: Set connection var ansible_shell_executable to /bin/sh 30575 1726867567.39573: Set connection var ansible_timeout to 10 30575 1726867567.39576: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867567.39580: Set connection var ansible_connection to ssh 30575 1726867567.39582: variable 'ansible_shell_executable' from source: unknown 30575 1726867567.39584: variable 'ansible_connection' from source: unknown 30575 1726867567.39585: variable 'ansible_module_compression' from source: unknown 30575 1726867567.39588: variable 'ansible_shell_type' from source: unknown 30575 1726867567.39589: variable 'ansible_shell_executable' from source: unknown 30575 1726867567.39591: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867567.39593: variable 'ansible_pipelining' from source: unknown 30575 1726867567.39595: variable 'ansible_timeout' from source: unknown 30575 1726867567.39597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867567.39931: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867567.39946: variable 'omit' from source: magic vars 30575 1726867567.39956: starting attempt loop 30575 1726867567.39962: running the handler 30575 1726867567.40182: variable 'ansible_facts' from source: unknown 30575 1726867567.40186: _low_level_execute_command(): starting 30575 1726867567.40188: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867567.41499: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867567.41598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867567.41601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867567.41603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867567.41792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867567.43596: stdout chunk (state=3): >>>/root <<< 30575 1726867567.43608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867567.43654: stderr chunk (state=3): >>><<< 30575 1726867567.43657: stdout chunk (state=3): >>><<< 30575 1726867567.43679: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867567.43699: _low_level_execute_command(): starting 30575 1726867567.43708: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867567.4368558-30706-134441465009846 `" && echo ansible-tmp-1726867567.4368558-30706-134441465009846="` echo /root/.ansible/tmp/ansible-tmp-1726867567.4368558-30706-134441465009846 `" ) && sleep 0' 30575 1726867567.44912: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867567.44915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867567.44917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867567.44920: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 30575 1726867567.44928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867567.45087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867567.45099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867567.45157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867567.47041: stdout chunk (state=3): >>>ansible-tmp-1726867567.4368558-30706-134441465009846=/root/.ansible/tmp/ansible-tmp-1726867567.4368558-30706-134441465009846 <<< 30575 1726867567.47190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867567.47246: stdout chunk (state=3): >>><<< 30575 1726867567.47263: stderr chunk (state=3): >>><<< 30575 1726867567.47286: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867567.4368558-30706-134441465009846=/root/.ansible/tmp/ansible-tmp-1726867567.4368558-30706-134441465009846 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867567.47341: variable 'ansible_module_compression' from source: unknown 30575 1726867567.47437: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 30575 1726867567.47645: variable 'ansible_facts' from source: unknown 30575 1726867567.48680: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867567.4368558-30706-134441465009846/AnsiballZ_setup.py 30575 1726867567.48899: Sending initial data 30575 1726867567.48907: Sent initial data (154 bytes) 30575 1726867567.50287: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867567.50499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867567.50563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867567.52206: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30575 1726867567.52221: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867567.52311: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867567.52364: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpt_ve_ke0 /root/.ansible/tmp/ansible-tmp-1726867567.4368558-30706-134441465009846/AnsiballZ_setup.py <<< 30575 1726867567.52379: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867567.4368558-30706-134441465009846/AnsiballZ_setup.py" <<< 30575 1726867567.52457: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpt_ve_ke0" to remote "/root/.ansible/tmp/ansible-tmp-1726867567.4368558-30706-134441465009846/AnsiballZ_setup.py" <<< 30575 1726867567.52471: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867567.4368558-30706-134441465009846/AnsiballZ_setup.py" <<< 30575 1726867567.55305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867567.55309: stdout chunk (state=3): >>><<< 30575 1726867567.55311: stderr chunk (state=3): >>><<< 30575 1726867567.55313: done transferring module to remote 30575 1726867567.55315: _low_level_execute_command(): starting 30575 1726867567.55317: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867567.4368558-30706-134441465009846/ /root/.ansible/tmp/ansible-tmp-1726867567.4368558-30706-134441465009846/AnsiballZ_setup.py && sleep 0' 30575 1726867567.56288: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867567.56292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 
1726867567.56509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867567.56520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867567.56576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867567.56592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867567.56792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867567.58527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867567.58566: stderr chunk (state=3): >>><<< 30575 1726867567.58590: stdout chunk (state=3): >>><<< 30575 1726867567.58887: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867567.58890: _low_level_execute_command(): starting 30575 1726867567.58892: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867567.4368558-30706-134441465009846/AnsiballZ_setup.py && sleep 0' 30575 1726867567.59789: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867567.59974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867567.59990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867567.60392: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867568.20589: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-68", "ansible_nodename": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24e9df8b51e91cc3587e46253f155b", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "26", "second": "07", "epoch": "1726867567", "epoch_int": "1726867567", "date": "2024-09-20", "time": "17:26:07", "iso8601_micro": "2024-09-20T21:26:07.869461Z", "iso8601": "2024-09-20T21:26:07Z", "iso8601_basic": "20240920T172607869461", "iso8601_basic_short": "20240920T172607", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", 
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7uUwLUrAgQyz7a8YAgUBvVYqUHIXrD9OD4IdRIvS0wM5DjkrqZknN+lxTMZpMWg/jlXFJVvFXYt0TDIxUv3VMQ7CG9CyYmBbeuJEWvoYwb8DuKGoQjvaw9hPa0a/tdKQRUk5Ee48tJDb1/f7b8HC6W47zMa508De0NKmJpkUCxPSiwETfkKtSFi1NU3yedKOlKSYO4jtNZMDSixlZgDT5la3jcB1k7FimMu61ZL4YdRdqowrsERzsKoyoubw2+euaXWxsKU9sxogT2uxy65PoA58KxP/BEqzQxzR9t9sEvGNVBRBcuBPyFKAEMwdm8wwEuHftGIX6HVD1ZyJ1kV94Sw1QBrBKVYLOc0F2Vfxah2KpheJtfxHN+3Y3VDCJCkheMOUfJL9Uq80f2+8xs3fb05mdaTabyPG6tsrK36M4NCUEwR/rlJ3z1xlUO5AQ7JnNr6OrRQTCXiZXYW8yubiTXlPYBD02/Zw1skEHGR9bVLEEd//GNW0z8DiLO9vRib8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFKa0veb+8P6VFgxqYEtIVaL2y6+Ja4kI5pG6tlGueD6mqrC1AYcokgYEcDSMDOhGEqO5Njf6G9zjcAWiPgmZds=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIE2riHWdppRksv40oyHGkAt2lseuRiuwNlSobn5rl+/f", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_loadavg": {"1m": 0.8916015625, "5m": 0.640625, "15m": 0.35888671875}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33<<< 30575 1726867568.20618: stdout chunk (state=3): >>>, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 49840 10.31.15.68 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 49840 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on 
[fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off 
[fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:de:45:ad:8b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.68", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:deff:fe45:ad8b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", 
"tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.68", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:de:45:ad:8b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.68"], "ansible_all_ipv6_addresses": ["fe80::8ff:deff:fe45:ad8b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.68", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:deff:fe45:ad8b"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2979, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 552, "free": 2979}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, 
"used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24e9df-8b51-e91c-c358-7e46253f155b", "ansible_product_uuid": "ec24e<<< 30575 1726867568.20632: stdout chunk (state=3): >>>9df-8b51-e91c-c358-7e46253f155b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 805, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, 
"size_available": 261800370176, "block_size": 4096, "block_total": 65519099, "block_available": 63916106, "block_used": 1602993, "inode_total": 131070960, "inode_available": 131029134, "inode_used": 41826, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 30575 1726867568.22570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867568.22604: stderr chunk (state=3): >>><<< 30575 1726867568.22607: stdout chunk (state=3): >>><<< 30575 1726867568.22638: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-68", "ansible_nodename": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24e9df8b51e91cc3587e46253f155b", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "26", "second": "07", "epoch": "1726867567", "epoch_int": "1726867567", "date": 
"2024-09-20", "time": "17:26:07", "iso8601_micro": "2024-09-20T21:26:07.869461Z", "iso8601": "2024-09-20T21:26:07Z", "iso8601_basic": "20240920T172607869461", "iso8601_basic_short": "20240920T172607", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7uUwLUrAgQyz7a8YAgUBvVYqUHIXrD9OD4IdRIvS0wM5DjkrqZknN+lxTMZpMWg/jlXFJVvFXYt0TDIxUv3VMQ7CG9CyYmBbeuJEWvoYwb8DuKGoQjvaw9hPa0a/tdKQRUk5Ee48tJDb1/f7b8HC6W47zMa508De0NKmJpkUCxPSiwETfkKtSFi1NU3yedKOlKSYO4jtNZMDSixlZgDT5la3jcB1k7FimMu61ZL4YdRdqowrsERzsKoyoubw2+euaXWxsKU9sxogT2uxy65PoA58KxP/BEqzQxzR9t9sEvGNVBRBcuBPyFKAEMwdm8wwEuHftGIX6HVD1ZyJ1kV94Sw1QBrBKVYLOc0F2Vfxah2KpheJtfxHN+3Y3VDCJCkheMOUfJL9Uq80f2+8xs3fb05mdaTabyPG6tsrK36M4NCUEwR/rlJ3z1xlUO5AQ7JnNr6OrRQTCXiZXYW8yubiTXlPYBD02/Zw1skEHGR9bVLEEd//GNW0z8DiLO9vRib8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFKa0veb+8P6VFgxqYEtIVaL2y6+Ja4kI5pG6tlGueD6mqrC1AYcokgYEcDSMDOhGEqO5Njf6G9zjcAWiPgmZds=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIE2riHWdppRksv40oyHGkAt2lseuRiuwNlSobn5rl+/f", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", 
"10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_loadavg": {"1m": 0.8916015625, "5m": 0.640625, "15m": 0.35888671875}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 49840 10.31.15.68 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 49840 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": 
[3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:de:45:ad:8b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.68", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:deff:fe45:ad8b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", 
"netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.68", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:de:45:ad:8b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.68"], "ansible_all_ipv6_addresses": ["fe80::8ff:deff:fe45:ad8b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.68", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:deff:fe45:ad8b"]}, "ansible_processor": ["0", 
"GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2979, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 552, "free": 2979}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24e9df-8b51-e91c-c358-7e46253f155b", "ansible_product_uuid": "ec24e9df-8b51-e91c-c358-7e46253f155b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", 
"sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 805, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261800370176, "block_size": 4096, "block_total": 65519099, "block_available": 63916106, "block_used": 1602993, "inode_total": 131070960, "inode_available": 131029134, "inode_used": 41826, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867568.22861: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867567.4368558-30706-134441465009846/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867568.22881: _low_level_execute_command(): starting 30575 1726867568.22884: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867567.4368558-30706-134441465009846/ > /dev/null 2>&1 && sleep 0' 30575 1726867568.23328: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867568.23331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867568.23343: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867568.23398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867568.23405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867568.23446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867568.25276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867568.25301: stderr chunk (state=3): >>><<< 30575 1726867568.25306: stdout chunk (state=3): >>><<< 30575 1726867568.25318: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867568.25326: handler run complete 30575 1726867568.25400: variable 'ansible_facts' from source: unknown 30575 1726867568.25460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867568.25642: variable 'ansible_facts' from source: unknown 30575 1726867568.25696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867568.25771: attempt loop complete, returning result 30575 1726867568.25774: _execute() done 30575 1726867568.25779: dumping result to json 30575 1726867568.25801: done dumping result, returning 30575 1726867568.25808: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affcac9-a3a5-e081-a588-000000000077] 30575 1726867568.25813: sending task result for task 0affcac9-a3a5-e081-a588-000000000077 30575 1726867568.26059: done sending task result for task 0affcac9-a3a5-e081-a588-000000000077 30575 1726867568.26062: WORKER PROCESS EXITING ok: [managed_node3] 30575 1726867568.26281: no more pending results, returning what we have 30575 1726867568.26283: results queue empty 30575 1726867568.26284: checking for any_errors_fatal 30575 1726867568.26284: done checking for any_errors_fatal 30575 1726867568.26285: checking for max_fail_percentage 30575 1726867568.26286: done checking for max_fail_percentage 30575 1726867568.26286: checking to see if all hosts have failed and the running result is not ok 30575 1726867568.26287: done checking to see if all hosts have failed 30575 1726867568.26287: getting the remaining hosts for this loop 30575 1726867568.26288: done getting the remaining hosts for this loop 30575 1726867568.26291: getting the next task for host managed_node3 30575 1726867568.26295: done getting next task for host managed_node3 30575 1726867568.26297: ^ 
task is: TASK: meta (flush_handlers) 30575 1726867568.26298: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867568.26300: getting variables 30575 1726867568.26301: in VariableManager get_vars() 30575 1726867568.26318: Calling all_inventory to load vars for managed_node3 30575 1726867568.26320: Calling groups_inventory to load vars for managed_node3 30575 1726867568.26321: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867568.26332: Calling all_plugins_play to load vars for managed_node3 30575 1726867568.26333: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867568.26335: Calling groups_plugins_play to load vars for managed_node3 30575 1726867568.26441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867568.26550: done with get_vars() 30575 1726867568.26557: done getting variables 30575 1726867568.26607: in VariableManager get_vars() 30575 1726867568.26613: Calling all_inventory to load vars for managed_node3 30575 1726867568.26615: Calling groups_inventory to load vars for managed_node3 30575 1726867568.26616: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867568.26620: Calling all_plugins_play to load vars for managed_node3 30575 1726867568.26622: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867568.26626: Calling groups_plugins_play to load vars for managed_node3 30575 1726867568.26732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867568.26909: done with get_vars() 30575 1726867568.26922: done queuing things up, now waiting for results queue to drain 30575 
1726867568.26926: results queue empty 30575 1726867568.26927: checking for any_errors_fatal 30575 1726867568.26930: done checking for any_errors_fatal 30575 1726867568.26931: checking for max_fail_percentage 30575 1726867568.26932: done checking for max_fail_percentage 30575 1726867568.26932: checking to see if all hosts have failed and the running result is not ok 30575 1726867568.26937: done checking to see if all hosts have failed 30575 1726867568.26938: getting the remaining hosts for this loop 30575 1726867568.26939: done getting the remaining hosts for this loop 30575 1726867568.26941: getting the next task for host managed_node3 30575 1726867568.26945: done getting next task for host managed_node3 30575 1726867568.26947: ^ task is: TASK: Show playbook name 30575 1726867568.26948: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867568.26950: getting variables 30575 1726867568.26951: in VariableManager get_vars() 30575 1726867568.26959: Calling all_inventory to load vars for managed_node3 30575 1726867568.26961: Calling groups_inventory to load vars for managed_node3 30575 1726867568.26963: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867568.26967: Calling all_plugins_play to load vars for managed_node3 30575 1726867568.26969: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867568.26971: Calling groups_plugins_play to load vars for managed_node3 30575 1726867568.27131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867568.27532: done with get_vars() 30575 1726867568.27540: done getting variables 30575 1726867568.27618: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show playbook name] ****************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:11 Friday 20 September 2024 17:26:08 -0400 (0:00:00.910) 0:00:03.654 ****** 30575 1726867568.27647: entering _queue_task() for managed_node3/debug 30575 1726867568.27649: Creating lock for debug 30575 1726867568.28342: worker is 1 (out of 1 available) 30575 1726867568.28354: exiting _queue_task() for managed_node3/debug 30575 1726867568.28364: done queuing things up, now waiting for results queue to drain 30575 1726867568.28365: waiting for pending results... 
30575 1726867568.29007: running TaskExecutor() for managed_node3/TASK: Show playbook name 30575 1726867568.29012: in run() - task 0affcac9-a3a5-e081-a588-00000000000b 30575 1726867568.29016: variable 'ansible_search_path' from source: unknown 30575 1726867568.29018: calling self._execute() 30575 1726867568.29190: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867568.29195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867568.29198: variable 'omit' from source: magic vars 30575 1726867568.29606: variable 'ansible_distribution_major_version' from source: facts 30575 1726867568.29630: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867568.29643: variable 'omit' from source: magic vars 30575 1726867568.29675: variable 'omit' from source: magic vars 30575 1726867568.29720: variable 'omit' from source: magic vars 30575 1726867568.29768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867568.29810: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867568.29839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867568.29864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867568.29886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867568.29925: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867568.29937: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867568.30082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867568.30085: Set connection var ansible_pipelining to False 30575 1726867568.30088: Set 
connection var ansible_shell_type to sh 30575 1726867568.30090: Set connection var ansible_shell_executable to /bin/sh 30575 1726867568.30092: Set connection var ansible_timeout to 10 30575 1726867568.30093: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867568.30095: Set connection var ansible_connection to ssh 30575 1726867568.30108: variable 'ansible_shell_executable' from source: unknown 30575 1726867568.30115: variable 'ansible_connection' from source: unknown 30575 1726867568.30122: variable 'ansible_module_compression' from source: unknown 30575 1726867568.30134: variable 'ansible_shell_type' from source: unknown 30575 1726867568.30140: variable 'ansible_shell_executable' from source: unknown 30575 1726867568.30146: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867568.30154: variable 'ansible_pipelining' from source: unknown 30575 1726867568.30160: variable 'ansible_timeout' from source: unknown 30575 1726867568.30166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867568.30307: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867568.30322: variable 'omit' from source: magic vars 30575 1726867568.30337: starting attempt loop 30575 1726867568.30345: running the handler 30575 1726867568.30397: handler run complete 30575 1726867568.30430: attempt loop complete, returning result 30575 1726867568.30438: _execute() done 30575 1726867568.30446: dumping result to json 30575 1726867568.30454: done dumping result, returning 30575 1726867568.30466: done running TaskExecutor() for managed_node3/TASK: Show playbook name [0affcac9-a3a5-e081-a588-00000000000b] 30575 1726867568.30478: sending task result for task 
0affcac9-a3a5-e081-a588-00000000000b 30575 1726867568.30690: done sending task result for task 0affcac9-a3a5-e081-a588-00000000000b 30575 1726867568.30695: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: this is: playbooks/tests_states.yml 30575 1726867568.30742: no more pending results, returning what we have 30575 1726867568.30745: results queue empty 30575 1726867568.30746: checking for any_errors_fatal 30575 1726867568.30747: done checking for any_errors_fatal 30575 1726867568.30748: checking for max_fail_percentage 30575 1726867568.30749: done checking for max_fail_percentage 30575 1726867568.30750: checking to see if all hosts have failed and the running result is not ok 30575 1726867568.30751: done checking to see if all hosts have failed 30575 1726867568.30751: getting the remaining hosts for this loop 30575 1726867568.30753: done getting the remaining hosts for this loop 30575 1726867568.30756: getting the next task for host managed_node3 30575 1726867568.30764: done getting next task for host managed_node3 30575 1726867568.30766: ^ task is: TASK: Include the task 'run_test.yml' 30575 1726867568.30768: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867568.30771: getting variables 30575 1726867568.30772: in VariableManager get_vars() 30575 1726867568.30805: Calling all_inventory to load vars for managed_node3 30575 1726867568.30808: Calling groups_inventory to load vars for managed_node3 30575 1726867568.30811: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867568.30819: Calling all_plugins_play to load vars for managed_node3 30575 1726867568.30821: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867568.30823: Calling groups_plugins_play to load vars for managed_node3 30575 1726867568.31001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867568.31269: done with get_vars() 30575 1726867568.31291: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:22 Friday 20 September 2024 17:26:08 -0400 (0:00:00.037) 0:00:03.691 ****** 30575 1726867568.31391: entering _queue_task() for managed_node3/include_tasks 30575 1726867568.31653: worker is 1 (out of 1 available) 30575 1726867568.31665: exiting _queue_task() for managed_node3/include_tasks 30575 1726867568.31989: done queuing things up, now waiting for results queue to drain 30575 1726867568.31991: waiting for pending results... 
30575 1726867568.32391: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' 30575 1726867568.32424: in run() - task 0affcac9-a3a5-e081-a588-00000000000d 30575 1726867568.32504: variable 'ansible_search_path' from source: unknown 30575 1726867568.32544: calling self._execute() 30575 1726867568.32626: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867568.32638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867568.32651: variable 'omit' from source: magic vars 30575 1726867568.32996: variable 'ansible_distribution_major_version' from source: facts 30575 1726867568.33012: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867568.33185: _execute() done 30575 1726867568.33188: dumping result to json 30575 1726867568.33190: done dumping result, returning 30575 1726867568.33193: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [0affcac9-a3a5-e081-a588-00000000000d] 30575 1726867568.33195: sending task result for task 0affcac9-a3a5-e081-a588-00000000000d 30575 1726867568.33270: done sending task result for task 0affcac9-a3a5-e081-a588-00000000000d 30575 1726867568.33274: WORKER PROCESS EXITING 30575 1726867568.33299: no more pending results, returning what we have 30575 1726867568.33302: in VariableManager get_vars() 30575 1726867568.33327: Calling all_inventory to load vars for managed_node3 30575 1726867568.33330: Calling groups_inventory to load vars for managed_node3 30575 1726867568.33332: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867568.33340: Calling all_plugins_play to load vars for managed_node3 30575 1726867568.33342: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867568.33345: Calling groups_plugins_play to load vars for managed_node3 30575 1726867568.33549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30575 1726867568.33743: done with get_vars() 30575 1726867568.33751: variable 'ansible_search_path' from source: unknown 30575 1726867568.33763: we have included files to process 30575 1726867568.33764: generating all_blocks data 30575 1726867568.33766: done generating all_blocks data 30575 1726867568.33767: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867568.33767: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867568.33770: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867568.34297: in VariableManager get_vars() 30575 1726867568.34312: done with get_vars() 30575 1726867568.34355: in VariableManager get_vars() 30575 1726867568.34369: done with get_vars() 30575 1726867568.34408: in VariableManager get_vars() 30575 1726867568.34422: done with get_vars() 30575 1726867568.34464: in VariableManager get_vars() 30575 1726867568.34481: done with get_vars() 30575 1726867568.34518: in VariableManager get_vars() 30575 1726867568.34532: done with get_vars() 30575 1726867568.34870: in VariableManager get_vars() 30575 1726867568.34890: done with get_vars() 30575 1726867568.34902: done processing included file 30575 1726867568.34904: iterating over new_blocks loaded from include file 30575 1726867568.34905: in VariableManager get_vars() 30575 1726867568.34914: done with get_vars() 30575 1726867568.34915: filtering new block on tags 30575 1726867568.35206: done filtering new block on tags 30575 1726867568.35215: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3 30575 1726867568.35221: extending task lists for all hosts with included 
blocks 30575 1726867568.35257: done extending task lists 30575 1726867568.35258: done processing included files 30575 1726867568.35259: results queue empty 30575 1726867568.35259: checking for any_errors_fatal 30575 1726867568.35262: done checking for any_errors_fatal 30575 1726867568.35263: checking for max_fail_percentage 30575 1726867568.35264: done checking for max_fail_percentage 30575 1726867568.35265: checking to see if all hosts have failed and the running result is not ok 30575 1726867568.35266: done checking to see if all hosts have failed 30575 1726867568.35266: getting the remaining hosts for this loop 30575 1726867568.35267: done getting the remaining hosts for this loop 30575 1726867568.35270: getting the next task for host managed_node3 30575 1726867568.35274: done getting next task for host managed_node3 30575 1726867568.35276: ^ task is: TASK: TEST: {{ lsr_description }} 30575 1726867568.35280: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867568.35282: getting variables 30575 1726867568.35283: in VariableManager get_vars() 30575 1726867568.35291: Calling all_inventory to load vars for managed_node3 30575 1726867568.35293: Calling groups_inventory to load vars for managed_node3 30575 1726867568.35295: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867568.35300: Calling all_plugins_play to load vars for managed_node3 30575 1726867568.35307: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867568.35310: Calling groups_plugins_play to load vars for managed_node3 30575 1726867568.35447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867568.35646: done with get_vars() 30575 1726867568.35654: done getting variables 30575 1726867568.35695: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867568.35813: variable 'lsr_description' from source: include params TASK [TEST: I can create a profile] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 17:26:08 -0400 (0:00:00.044) 0:00:03.736 ****** 30575 1726867568.35855: entering _queue_task() for managed_node3/debug 30575 1726867568.36383: worker is 1 (out of 1 available) 30575 1726867568.36392: exiting _queue_task() for managed_node3/debug 30575 1726867568.36403: done queuing things up, now waiting for results queue to drain 30575 1726867568.36404: waiting for pending results... 
30575 1726867568.36480: running TaskExecutor() for managed_node3/TASK: TEST: I can create a profile 30575 1726867568.36595: in run() - task 0affcac9-a3a5-e081-a588-000000000091 30575 1726867568.36615: variable 'ansible_search_path' from source: unknown 30575 1726867568.36629: variable 'ansible_search_path' from source: unknown 30575 1726867568.36673: calling self._execute() 30575 1726867568.36761: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867568.36775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867568.36851: variable 'omit' from source: magic vars 30575 1726867568.37184: variable 'ansible_distribution_major_version' from source: facts 30575 1726867568.37203: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867568.37214: variable 'omit' from source: magic vars 30575 1726867568.37254: variable 'omit' from source: magic vars 30575 1726867568.37364: variable 'lsr_description' from source: include params 30575 1726867568.37394: variable 'omit' from source: magic vars 30575 1726867568.37444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867568.37501: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867568.37520: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867568.37610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867568.37616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867568.37619: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867568.37621: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867568.37623: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867568.37718: Set connection var ansible_pipelining to False 30575 1726867568.37737: Set connection var ansible_shell_type to sh 30575 1726867568.37741: Set connection var ansible_shell_executable to /bin/sh 30575 1726867568.37828: Set connection var ansible_timeout to 10 30575 1726867568.37831: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867568.37834: Set connection var ansible_connection to ssh 30575 1726867568.37836: variable 'ansible_shell_executable' from source: unknown 30575 1726867568.37838: variable 'ansible_connection' from source: unknown 30575 1726867568.37842: variable 'ansible_module_compression' from source: unknown 30575 1726867568.37844: variable 'ansible_shell_type' from source: unknown 30575 1726867568.37847: variable 'ansible_shell_executable' from source: unknown 30575 1726867568.37849: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867568.37851: variable 'ansible_pipelining' from source: unknown 30575 1726867568.37853: variable 'ansible_timeout' from source: unknown 30575 1726867568.37855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867568.38045: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867568.38049: variable 'omit' from source: magic vars 30575 1726867568.38051: starting attempt loop 30575 1726867568.38053: running the handler 30575 1726867568.38060: handler run complete 30575 1726867568.38086: attempt loop complete, returning result 30575 1726867568.38095: _execute() done 30575 1726867568.38102: dumping result to json 30575 1726867568.38110: done dumping result, returning 30575 1726867568.38120: done 
running TaskExecutor() for managed_node3/TASK: TEST: I can create a profile [0affcac9-a3a5-e081-a588-000000000091] 30575 1726867568.38130: sending task result for task 0affcac9-a3a5-e081-a588-000000000091 ok: [managed_node3] => {} MSG: ########## I can create a profile ########## 30575 1726867568.38346: no more pending results, returning what we have 30575 1726867568.38350: results queue empty 30575 1726867568.38351: checking for any_errors_fatal 30575 1726867568.38353: done checking for any_errors_fatal 30575 1726867568.38353: checking for max_fail_percentage 30575 1726867568.38355: done checking for max_fail_percentage 30575 1726867568.38356: checking to see if all hosts have failed and the running result is not ok 30575 1726867568.38357: done checking to see if all hosts have failed 30575 1726867568.38357: getting the remaining hosts for this loop 30575 1726867568.38359: done getting the remaining hosts for this loop 30575 1726867568.38363: getting the next task for host managed_node3 30575 1726867568.38375: done getting next task for host managed_node3 30575 1726867568.38381: ^ task is: TASK: Show item 30575 1726867568.38384: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867568.38389: getting variables 30575 1726867568.38391: in VariableManager get_vars() 30575 1726867568.38417: Calling all_inventory to load vars for managed_node3 30575 1726867568.38420: Calling groups_inventory to load vars for managed_node3 30575 1726867568.38423: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867568.38434: Calling all_plugins_play to load vars for managed_node3 30575 1726867568.38437: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867568.38440: Calling groups_plugins_play to load vars for managed_node3 30575 1726867568.38821: done sending task result for task 0affcac9-a3a5-e081-a588-000000000091 30575 1726867568.38825: WORKER PROCESS EXITING 30575 1726867568.38846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867568.39043: done with get_vars() 30575 1726867568.39052: done getting variables 30575 1726867568.39106: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 17:26:08 -0400 (0:00:00.032) 0:00:03.768 ****** 30575 1726867568.39136: entering _queue_task() for managed_node3/debug 30575 1726867568.39497: worker is 1 (out of 1 available) 30575 1726867568.39507: exiting _queue_task() for managed_node3/debug 30575 1726867568.39517: done queuing things up, now waiting for results queue to drain 30575 1726867568.39518: waiting for pending results... 
30575 1726867568.39624: running TaskExecutor() for managed_node3/TASK: Show item 30575 1726867568.39726: in run() - task 0affcac9-a3a5-e081-a588-000000000092 30575 1726867568.39754: variable 'ansible_search_path' from source: unknown 30575 1726867568.39762: variable 'ansible_search_path' from source: unknown 30575 1726867568.39811: variable 'omit' from source: magic vars 30575 1726867568.39922: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867568.39936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867568.39958: variable 'omit' from source: magic vars 30575 1726867568.40259: variable 'ansible_distribution_major_version' from source: facts 30575 1726867568.40282: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867568.40291: variable 'omit' from source: magic vars 30575 1726867568.40325: variable 'omit' from source: magic vars 30575 1726867568.40365: variable 'item' from source: unknown 30575 1726867568.40439: variable 'item' from source: unknown 30575 1726867568.40460: variable 'omit' from source: magic vars 30575 1726867568.40510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867568.40550: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867568.40571: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867568.40599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867568.40620: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867568.40652: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867568.40683: variable 'ansible_host' from source: host vars for 'managed_node3' 
30575 1726867568.40686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.40778: Set connection var ansible_pipelining to False
30575 1726867568.40815: Set connection var ansible_shell_type to sh
30575 1726867568.40823: Set connection var ansible_shell_executable to /bin/sh
30575 1726867568.40825: Set connection var ansible_timeout to 10
30575 1726867568.40827: Set connection var ansible_module_compression to ZIP_DEFLATED
30575 1726867568.40830: Set connection var ansible_connection to ssh
30575 1726867568.40856: variable 'ansible_shell_executable' from source: unknown
30575 1726867568.40865: variable 'ansible_connection' from source: unknown
30575 1726867568.40925: variable 'ansible_module_compression' from source: unknown
30575 1726867568.40932: variable 'ansible_shell_type' from source: unknown
30575 1726867568.40935: variable 'ansible_shell_executable' from source: unknown
30575 1726867568.40937: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.40939: variable 'ansible_pipelining' from source: unknown
30575 1726867568.40942: variable 'ansible_timeout' from source: unknown
30575 1726867568.40944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.41053: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30575 1726867568.41069: variable 'omit' from source: magic vars
30575 1726867568.41081: starting attempt loop
30575 1726867568.41089: running the handler
30575 1726867568.41133: variable 'lsr_description' from source: include params
30575 1726867568.41205: variable 'lsr_description' from source: include params
30575 1726867568.41254: handler run complete
30575 1726867568.41258: attempt loop complete, returning result
30575 1726867568.41267: variable 'item' from source: unknown
30575 1726867568.41331: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_description) => {
    "ansible_loop_var": "item",
    "item": "lsr_description",
    "lsr_description": "I can create a profile"
}
30575 1726867568.41695: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.41698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.41700: variable 'omit' from source: magic vars
30575 1726867568.41710: variable 'ansible_distribution_major_version' from source: facts
30575 1726867568.41720: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867568.41729: variable 'omit' from source: magic vars
30575 1726867568.41746: variable 'omit' from source: magic vars
30575 1726867568.41791: variable 'item' from source: unknown
30575 1726867568.41859: variable 'item' from source: unknown
30575 1726867568.41879: variable 'omit' from source: magic vars
30575 1726867568.41902: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30575 1726867568.41924: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867568.41982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867568.41985: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30575 1726867568.41988: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.41990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.42044: Set connection var ansible_pipelining to False
30575 1726867568.42052: Set connection var ansible_shell_type to sh
30575 1726867568.42062: Set connection var ansible_shell_executable to /bin/sh
30575 1726867568.42074: Set connection var ansible_timeout to 10
30575 1726867568.42086: Set connection var ansible_module_compression to ZIP_DEFLATED
30575 1726867568.42097: Set connection var ansible_connection to ssh
30575 1726867568.42180: variable 'ansible_shell_executable' from source: unknown
30575 1726867568.42185: variable 'ansible_connection' from source: unknown
30575 1726867568.42188: variable 'ansible_module_compression' from source: unknown
30575 1726867568.42190: variable 'ansible_shell_type' from source: unknown
30575 1726867568.42192: variable 'ansible_shell_executable' from source: unknown
30575 1726867568.42194: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.42196: variable 'ansible_pipelining' from source: unknown
30575 1726867568.42198: variable 'ansible_timeout' from source: unknown
30575 1726867568.42200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.42315: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30575 1726867568.42318: variable 'omit' from source: magic vars
30575 1726867568.42320: starting attempt loop
30575 1726867568.42323: running the handler
30575 1726867568.42325: variable 'lsr_setup' from source: include params
30575 1726867568.42387: variable 'lsr_setup' from source: include params
30575 1726867568.42435: handler run complete
30575 1726867568.42532: attempt loop complete, returning result
30575 1726867568.42535: variable 'item' from source: unknown
30575 1726867568.42538: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_setup) => {
    "ansible_loop_var": "item",
    "item": "lsr_setup",
    "lsr_setup": [
        "tasks/delete_interface.yml",
        "tasks/assert_device_absent.yml"
    ]
}
30575 1726867568.42750: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.42753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.42755: variable 'omit' from source: magic vars
30575 1726867568.42865: variable 'ansible_distribution_major_version' from source: facts
30575 1726867568.42878: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867568.42892: variable 'omit' from source: magic vars
30575 1726867568.42910: variable 'omit' from source: magic vars
30575 1726867568.42952: variable 'item' from source: unknown
30575 1726867568.43021: variable 'item' from source: unknown
30575 1726867568.43038: variable 'omit' from source: magic vars
30575 1726867568.43080: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30575 1726867568.43083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867568.43085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867568.43087: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30575 1726867568.43092: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.43099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.43171: Set connection var ansible_pipelining to False
30575 1726867568.43216: Set connection var ansible_shell_type to sh
30575 1726867568.43219: Set connection var ansible_shell_executable to /bin/sh
30575 1726867568.43221: Set connection var ansible_timeout to 10
30575 1726867568.43223: Set connection var ansible_module_compression to ZIP_DEFLATED
30575 1726867568.43225: Set connection var ansible_connection to ssh
30575 1726867568.43241: variable 'ansible_shell_executable' from source: unknown
30575 1726867568.43248: variable 'ansible_connection' from source: unknown
30575 1726867568.43254: variable 'ansible_module_compression' from source: unknown
30575 1726867568.43259: variable 'ansible_shell_type' from source: unknown
30575 1726867568.43280: variable 'ansible_shell_executable' from source: unknown
30575 1726867568.43283: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.43285: variable 'ansible_pipelining' from source: unknown
30575 1726867568.43291: variable 'ansible_timeout' from source: unknown
30575 1726867568.43292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.43368: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30575 1726867568.43400: variable 'omit' from source: magic vars
30575 1726867568.43403: starting attempt loop
30575 1726867568.43405: running the handler
30575 1726867568.43412: variable 'lsr_test' from source: include params
30575 1726867568.43474: variable 'lsr_test' from source: include params
30575 1726867568.43509: handler run complete
30575 1726867568.43517: attempt loop complete, returning result
30575 1726867568.43533: variable 'item' from source: unknown
30575 1726867568.43618: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_test) => {
    "ansible_loop_var": "item",
    "item": "lsr_test",
    "lsr_test": [
        "tasks/create_bridge_profile.yml"
    ]
}
30575 1726867568.43760: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.43763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.43772: variable 'omit' from source: magic vars
30575 1726867568.43944: variable 'ansible_distribution_major_version' from source: facts
30575 1726867568.43947: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867568.43949: variable 'omit' from source: magic vars
30575 1726867568.43957: variable 'omit' from source: magic vars
30575 1726867568.44002: variable 'item' from source: unknown
30575 1726867568.44084: variable 'item' from source: unknown
30575 1726867568.44093: variable 'omit' from source: magic vars
30575 1726867568.44113: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30575 1726867568.44161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867568.44164: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867568.44167: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30575 1726867568.44169: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.44171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.44232: Set connection var ansible_pipelining to False
30575 1726867568.44239: Set connection var ansible_shell_type to sh
30575 1726867568.44247: Set connection var ansible_shell_executable to /bin/sh
30575 1726867568.44256: Set connection var ansible_timeout to 10
30575 1726867568.44266: Set connection var ansible_module_compression to ZIP_DEFLATED
30575 1726867568.44301: Set connection var ansible_connection to ssh
30575 1726867568.44382: variable 'ansible_shell_executable' from source: unknown
30575 1726867568.44385: variable 'ansible_connection' from source: unknown
30575 1726867568.44388: variable 'ansible_module_compression' from source: unknown
30575 1726867568.44389: variable 'ansible_shell_type' from source: unknown
30575 1726867568.44391: variable 'ansible_shell_executable' from source: unknown
30575 1726867568.44393: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.44395: variable 'ansible_pipelining' from source: unknown
30575 1726867568.44397: variable 'ansible_timeout' from source: unknown
30575 1726867568.44399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.44437: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30575 1726867568.44449: variable 'omit' from source: magic vars
30575 1726867568.44456: starting attempt loop
30575 1726867568.44462: running the handler
30575 1726867568.44485: variable 'lsr_assert' from source: include params
30575 1726867568.44556: variable 'lsr_assert' from source: include params
30575 1726867568.44574: handler run complete
30575 1726867568.44592: attempt loop complete, returning result
30575 1726867568.44609: variable 'item' from source: unknown
30575 1726867568.44709: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_assert) => {
    "ansible_loop_var": "item",
    "item": "lsr_assert",
    "lsr_assert": [
        "tasks/assert_profile_present.yml"
    ]
}
30575 1726867568.44955: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.44959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.44961: variable 'omit' from source: magic vars
30575 1726867568.45019: variable 'ansible_distribution_major_version' from source: facts
30575 1726867568.45064: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867568.45071: variable 'omit' from source: magic vars
30575 1726867568.45073: variable 'omit' from source: magic vars
30575 1726867568.45104: variable 'item' from source: unknown
30575 1726867568.45170: variable 'item' from source: unknown
30575 1726867568.45192: variable 'omit' from source: magic vars
30575 1726867568.45283: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30575 1726867568.45286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867568.45288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867568.45290: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30575 1726867568.45292: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.45294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.45325: Set connection var ansible_pipelining to False
30575 1726867568.45333: Set connection var ansible_shell_type to sh
30575 1726867568.45341: Set connection var ansible_shell_executable to /bin/sh
30575 1726867568.45349: Set connection var ansible_timeout to 10
30575 1726867568.45357: Set connection var ansible_module_compression to ZIP_DEFLATED
30575 1726867568.45367: Set connection var ansible_connection to ssh
30575 1726867568.45396: variable 'ansible_shell_executable' from source: unknown
30575 1726867568.45482: variable 'ansible_connection' from source: unknown
30575 1726867568.45485: variable 'ansible_module_compression' from source: unknown
30575 1726867568.45487: variable 'ansible_shell_type' from source: unknown
30575 1726867568.45489: variable 'ansible_shell_executable' from source: unknown
30575 1726867568.45491: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.45494: variable 'ansible_pipelining' from source: unknown
30575 1726867568.45496: variable 'ansible_timeout' from source: unknown
30575 1726867568.45498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.45530: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30575 1726867568.45541: variable 'omit' from source: magic vars
30575 1726867568.45548: starting attempt loop
30575 1726867568.45554: running the handler
30575 1726867568.45574: variable 'lsr_assert_when' from source: include params
30575 1726867568.45645: variable 'lsr_assert_when' from source: include params
30575 1726867568.45741: variable 'network_provider' from source: set_fact
30575 1726867568.45775: handler run complete
30575 1726867568.45796: attempt loop complete, returning result
30575 1726867568.45831: variable 'item' from source: unknown
30575 1726867568.45884: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_assert_when) => {
    "ansible_loop_var": "item",
    "item": "lsr_assert_when",
    "lsr_assert_when": [
        {
            "condition": true,
            "what": "tasks/assert_device_present.yml"
        }
    ]
}
30575 1726867568.46205: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.46208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.46210: variable 'omit' from source: magic vars
30575 1726867568.46212: variable 'ansible_distribution_major_version' from source: facts
30575 1726867568.46214: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867568.46216: variable 'omit' from source: magic vars
30575 1726867568.46218: variable 'omit' from source: magic vars
30575 1726867568.46258: variable 'item' from source: unknown
30575 1726867568.46321: variable 'item' from source: unknown
30575 1726867568.46347: variable 'omit' from source: magic vars
30575 1726867568.46368: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30575 1726867568.46381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867568.46393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867568.46406: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30575 1726867568.46414: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.46422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.46494: Set connection var ansible_pipelining to False
30575 1726867568.46501: Set connection var ansible_shell_type to sh
30575 1726867568.46508: Set connection var ansible_shell_executable to /bin/sh
30575 1726867568.46515: Set connection var ansible_timeout to 10
30575 1726867568.46521: Set connection var ansible_module_compression to ZIP_DEFLATED
30575 1726867568.46530: Set connection var ansible_connection to ssh
30575 1726867568.46556: variable 'ansible_shell_executable' from source: unknown
30575 1726867568.46662: variable 'ansible_connection' from source: unknown
30575 1726867568.46665: variable 'ansible_module_compression' from source: unknown
30575 1726867568.46667: variable 'ansible_shell_type' from source: unknown
30575 1726867568.46669: variable 'ansible_shell_executable' from source: unknown
30575 1726867568.46671: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.46673: variable 'ansible_pipelining' from source: unknown
30575 1726867568.46675: variable 'ansible_timeout' from source: unknown
30575 1726867568.46679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.46681: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30575 1726867568.46683: variable 'omit' from source: magic vars
30575 1726867568.46685: starting attempt loop
30575 1726867568.46688: running the handler
30575 1726867568.46712: variable 'lsr_fail_debug' from source: play vars
30575 1726867568.46787: variable 'lsr_fail_debug' from source: play vars
30575 1726867568.46808: handler run complete
30575 1726867568.46825: attempt loop complete, returning result
30575 1726867568.46844: variable 'item' from source: unknown
30575 1726867568.46917: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_fail_debug) => {
    "ansible_loop_var": "item",
    "item": "lsr_fail_debug",
    "lsr_fail_debug": [
        "__network_connections_result"
    ]
}
30575 1726867568.47110: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.47113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.47116: variable 'omit' from source: magic vars
30575 1726867568.47242: variable 'ansible_distribution_major_version' from source: facts
30575 1726867568.47253: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867568.47261: variable 'omit' from source: magic vars
30575 1726867568.47281: variable 'omit' from source: magic vars
30575 1726867568.47328: variable 'item' from source: unknown
30575 1726867568.47395: variable 'item' from source: unknown
30575 1726867568.47414: variable 'omit' from source: magic vars
30575 1726867568.47443: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30575 1726867568.47462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867568.47473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867568.47547: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30575 1726867568.47550: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.47553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.47580: Set connection var ansible_pipelining to False
30575 1726867568.47589: Set connection var ansible_shell_type to sh
30575 1726867568.47599: Set connection var ansible_shell_executable to /bin/sh
30575 1726867568.47609: Set connection var ansible_timeout to 10
30575 1726867568.47618: Set connection var ansible_module_compression to ZIP_DEFLATED
30575 1726867568.47629: Set connection var ansible_connection to ssh
30575 1726867568.47660: variable 'ansible_shell_executable' from source: unknown
30575 1726867568.47669: variable 'ansible_connection' from source: unknown
30575 1726867568.47678: variable 'ansible_module_compression' from source: unknown
30575 1726867568.47763: variable 'ansible_shell_type' from source: unknown
30575 1726867568.47766: variable 'ansible_shell_executable' from source: unknown
30575 1726867568.47769: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.47771: variable 'ansible_pipelining' from source: unknown
30575 1726867568.47773: variable 'ansible_timeout' from source: unknown
30575 1726867568.47775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.47807: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30575 1726867568.47819: variable 'omit' from source: magic vars
30575 1726867568.47827: starting attempt loop
30575 1726867568.47834: running the handler
30575 1726867568.47856: variable 'lsr_cleanup' from source: include params
30575 1726867568.47925: variable 'lsr_cleanup' from source: include params
30575 1726867568.47946: handler run complete
30575 1726867568.47963: attempt loop complete, returning result
30575 1726867568.47990: variable 'item' from source: unknown
30575 1726867568.48052: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_cleanup) => {
    "ansible_loop_var": "item",
    "item": "lsr_cleanup",
    "lsr_cleanup": [
        "tasks/cleanup_profile+device.yml"
    ]
}
30575 1726867568.48244: dumping result to json
30575 1726867568.48247: done dumping result, returning
30575 1726867568.48249: done running TaskExecutor() for managed_node3/TASK: Show item [0affcac9-a3a5-e081-a588-000000000092]
30575 1726867568.48251: sending task result for task 0affcac9-a3a5-e081-a588-000000000092
30575 1726867568.48406: no more pending results, returning what we have
30575 1726867568.48410: results queue empty
30575 1726867568.48410: checking for any_errors_fatal
30575 1726867568.48415: done checking for any_errors_fatal
30575 1726867568.48416: checking for max_fail_percentage
30575 1726867568.48418: done checking for max_fail_percentage
30575 1726867568.48418: checking to see if all hosts have failed and the running result is not ok
30575 1726867568.48419: done checking to see if all hosts have failed
30575 1726867568.48420: getting the remaining hosts for this loop
30575 1726867568.48422: done getting the remaining hosts for this loop
30575 1726867568.48425: getting the next task for host managed_node3
30575 1726867568.48433: done getting next task for host managed_node3
30575 1726867568.48436: ^ task is: TASK: Include the task 'show_interfaces.yml'
30575 1726867568.48438: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867568.48442: getting variables
30575 1726867568.48444: in VariableManager get_vars()
30575 1726867568.48472: Calling all_inventory to load vars for managed_node3
30575 1726867568.48475: Calling groups_inventory to load vars for managed_node3
30575 1726867568.48481: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867568.48493: Calling all_plugins_play to load vars for managed_node3
30575 1726867568.48495: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867568.48498: Calling groups_plugins_play to load vars for managed_node3
30575 1726867568.48881: done sending task result for task 0affcac9-a3a5-e081-a588-000000000092
30575 1726867568.48885: WORKER PROCESS EXITING
30575 1726867568.48913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867568.49111: done with get_vars()
30575 1726867568.49126: done getting variables

TASK [Include the task 'show_interfaces.yml'] **********************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21
Friday 20 September 2024 17:26:08 -0400 (0:00:00.100) 0:00:03.869 ******
30575 1726867568.49216: entering _queue_task() for managed_node3/include_tasks
30575 1726867568.49565: worker is 1 (out of 1 available)
30575 1726867568.49574: exiting _queue_task() for managed_node3/include_tasks
30575 1726867568.49588: done queuing things up, now waiting for results queue to drain
30575 1726867568.49590: waiting for pending results...
30575 1726867568.49725: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml'
30575 1726867568.49827: in run() - task 0affcac9-a3a5-e081-a588-000000000093
30575 1726867568.49847: variable 'ansible_search_path' from source: unknown
30575 1726867568.49855: variable 'ansible_search_path' from source: unknown
30575 1726867568.49897: calling self._execute()
30575 1726867568.49970: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867568.49983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867568.50001: variable 'omit' from source: magic vars
30575 1726867568.50369: variable 'ansible_distribution_major_version' from source: facts
30575 1726867568.50389: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867568.50398: _execute() done
30575 1726867568.50436: dumping result to json
30575 1726867568.50440: done dumping result, returning
30575 1726867568.50442: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affcac9-a3a5-e081-a588-000000000093]
30575 1726867568.50444: sending task result for task 0affcac9-a3a5-e081-a588-000000000093
30575 1726867568.50656: no more pending results, returning what we have
30575 1726867568.50660: in VariableManager get_vars()
30575 1726867568.50693: Calling all_inventory to load vars for managed_node3
30575 1726867568.50695: Calling groups_inventory to load vars for managed_node3
30575 1726867568.50699: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867568.50709: Calling all_plugins_play to load vars for managed_node3
30575 1726867568.50712: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867568.50715: Calling groups_plugins_play to load vars for managed_node3
30575 1726867568.50952: done sending task result for task 0affcac9-a3a5-e081-a588-000000000093
30575 1726867568.50955: WORKER PROCESS EXITING
30575 1726867568.50981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867568.51171: done with get_vars()
30575 1726867568.51179: variable 'ansible_search_path' from source: unknown
30575 1726867568.51181: variable 'ansible_search_path' from source: unknown
30575 1726867568.51222: we have included files to process
30575 1726867568.51223: generating all_blocks data
30575 1726867568.51224: done generating all_blocks data
30575 1726867568.51228: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
30575 1726867568.51229: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
30575 1726867568.51231: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
30575 1726867568.51371: in VariableManager get_vars()
30575 1726867568.51390: done with get_vars()
30575 1726867568.51493: done processing included file
30575 1726867568.51495: iterating over new_blocks loaded from include file
30575 1726867568.51497: in VariableManager get_vars()
30575 1726867568.51508: done with get_vars()
30575 1726867568.51510: filtering new block on tags
30575 1726867568.51546: done filtering new block on tags
30575 1726867568.51548: done iterating over new_blocks loaded from include file
included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3
30575 1726867568.51552: extending task lists for all hosts with included blocks
30575 1726867568.52035: done extending task lists
30575 1726867568.52036: done processing included files
30575 1726867568.52037: results queue empty
30575 1726867568.52038: checking for any_errors_fatal
30575 1726867568.52041: done checking for any_errors_fatal
30575 1726867568.52042: checking for max_fail_percentage
30575 1726867568.52043: done checking for max_fail_percentage
30575 1726867568.52044: checking to see if all hosts have failed and the running result is not ok
30575 1726867568.52045: done checking to see if all hosts have failed
30575 1726867568.52046: getting the remaining hosts for this loop
30575 1726867568.52047: done getting the remaining hosts for this loop
30575 1726867568.52049: getting the next task for host managed_node3
30575 1726867568.52053: done getting next task for host managed_node3
30575 1726867568.52055: ^ task is: TASK: Include the task 'get_current_interfaces.yml'
30575 1726867568.52058: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867568.52064: getting variables
30575 1726867568.52066: in VariableManager get_vars()
30575 1726867568.52074: Calling all_inventory to load vars for managed_node3
30575 1726867568.52076: Calling groups_inventory to load vars for managed_node3
30575 1726867568.52080: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867568.52085: Calling all_plugins_play to load vars for managed_node3
30575 1726867568.52087: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867568.52090: Calling groups_plugins_play to load vars for managed_node3
30575 1726867568.52227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867568.52420: done with get_vars()
30575 1726867568.52428: done getting variables

TASK [Include the task 'get_current_interfaces.yml'] ***************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3
Friday 20 September 2024 17:26:08 -0400 (0:00:00.032) 0:00:03.902 ******
30575 1726867568.52498: entering _queue_task() for managed_node3/include_tasks
30575 1726867568.52830: worker is 1 (out of 1 available)
30575 1726867568.52839: exiting _queue_task() for managed_node3/include_tasks
30575 1726867568.52848: done queuing things up, now waiting for results queue to drain
30575 1726867568.52849: waiting for pending results...
30575 1726867568.52976: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 30575 1726867568.53094: in run() - task 0affcac9-a3a5-e081-a588-0000000000ba 30575 1726867568.53113: variable 'ansible_search_path' from source: unknown 30575 1726867568.53121: variable 'ansible_search_path' from source: unknown 30575 1726867568.53166: calling self._execute() 30575 1726867568.53247: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867568.53265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867568.53281: variable 'omit' from source: magic vars 30575 1726867568.53702: variable 'ansible_distribution_major_version' from source: facts 30575 1726867568.53705: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867568.53708: _execute() done 30575 1726867568.53710: dumping result to json 30575 1726867568.53713: done dumping result, returning 30575 1726867568.53716: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affcac9-a3a5-e081-a588-0000000000ba] 30575 1726867568.53719: sending task result for task 0affcac9-a3a5-e081-a588-0000000000ba 30575 1726867568.53907: no more pending results, returning what we have 30575 1726867568.53912: in VariableManager get_vars() 30575 1726867568.53953: Calling all_inventory to load vars for managed_node3 30575 1726867568.53956: Calling groups_inventory to load vars for managed_node3 30575 1726867568.53960: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867568.53973: Calling all_plugins_play to load vars for managed_node3 30575 1726867568.53976: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867568.53982: Calling groups_plugins_play to load vars for managed_node3 30575 1726867568.54274: done sending task result for task 0affcac9-a3a5-e081-a588-0000000000ba 30575 1726867568.54279: WORKER PROCESS EXITING 30575 
1726867568.54298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867568.54519: done with get_vars() 30575 1726867568.54526: variable 'ansible_search_path' from source: unknown 30575 1726867568.54527: variable 'ansible_search_path' from source: unknown 30575 1726867568.54564: we have included files to process 30575 1726867568.54565: generating all_blocks data 30575 1726867568.54567: done generating all_blocks data 30575 1726867568.54568: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867568.54569: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867568.54571: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867568.54882: done processing included file 30575 1726867568.54884: iterating over new_blocks loaded from include file 30575 1726867568.54886: in VariableManager get_vars() 30575 1726867568.54898: done with get_vars() 30575 1726867568.54899: filtering new block on tags 30575 1726867568.54934: done filtering new block on tags 30575 1726867568.54937: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 30575 1726867568.54941: extending task lists for all hosts with included blocks 30575 1726867568.55108: done extending task lists 30575 1726867568.55110: done processing included files 30575 1726867568.55111: results queue empty 30575 1726867568.55111: checking for any_errors_fatal 30575 1726867568.55114: done checking for any_errors_fatal 30575 1726867568.55115: checking for max_fail_percentage 30575 1726867568.55116: done 
checking for max_fail_percentage 30575 1726867568.55116: checking to see if all hosts have failed and the running result is not ok 30575 1726867568.55117: done checking to see if all hosts have failed 30575 1726867568.55118: getting the remaining hosts for this loop 30575 1726867568.55119: done getting the remaining hosts for this loop 30575 1726867568.55121: getting the next task for host managed_node3 30575 1726867568.55126: done getting next task for host managed_node3 30575 1726867568.55128: ^ task is: TASK: Gather current interface info 30575 1726867568.55131: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867568.55133: getting variables 30575 1726867568.55134: in VariableManager get_vars() 30575 1726867568.55142: Calling all_inventory to load vars for managed_node3 30575 1726867568.55144: Calling groups_inventory to load vars for managed_node3 30575 1726867568.55146: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867568.55151: Calling all_plugins_play to load vars for managed_node3 30575 1726867568.55153: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867568.55156: Calling groups_plugins_play to load vars for managed_node3 30575 1726867568.55302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867568.55505: done with get_vars() 30575 1726867568.55516: done getting variables 30575 1726867568.55548: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 17:26:08 -0400 (0:00:00.030) 0:00:03.933 ****** 30575 1726867568.55572: entering _queue_task() for managed_node3/command 30575 1726867568.55808: worker is 1 (out of 1 available) 30575 1726867568.55818: exiting _queue_task() for managed_node3/command 30575 1726867568.55831: done queuing things up, now waiting for results queue to drain 30575 1726867568.55832: waiting for pending results... 
30575 1726867568.56063: running TaskExecutor() for managed_node3/TASK: Gather current interface info 30575 1726867568.56172: in run() - task 0affcac9-a3a5-e081-a588-0000000000f5 30575 1726867568.56197: variable 'ansible_search_path' from source: unknown 30575 1726867568.56207: variable 'ansible_search_path' from source: unknown 30575 1726867568.56246: calling self._execute() 30575 1726867568.56324: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867568.56336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867568.56348: variable 'omit' from source: magic vars 30575 1726867568.56701: variable 'ansible_distribution_major_version' from source: facts 30575 1726867568.56725: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867568.56746: variable 'omit' from source: magic vars 30575 1726867568.56786: variable 'omit' from source: magic vars 30575 1726867568.56834: variable 'omit' from source: magic vars 30575 1726867568.56872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867568.56942: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867568.56946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867568.56953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867568.56975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867568.57010: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867568.57018: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867568.57025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867568.57159: Set connection var ansible_pipelining to False 30575 1726867568.57162: Set connection var ansible_shell_type to sh 30575 1726867568.57164: Set connection var ansible_shell_executable to /bin/sh 30575 1726867568.57166: Set connection var ansible_timeout to 10 30575 1726867568.57168: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867568.57170: Set connection var ansible_connection to ssh 30575 1726867568.57198: variable 'ansible_shell_executable' from source: unknown 30575 1726867568.57205: variable 'ansible_connection' from source: unknown 30575 1726867568.57212: variable 'ansible_module_compression' from source: unknown 30575 1726867568.57218: variable 'ansible_shell_type' from source: unknown 30575 1726867568.57269: variable 'ansible_shell_executable' from source: unknown 30575 1726867568.57272: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867568.57275: variable 'ansible_pipelining' from source: unknown 30575 1726867568.57281: variable 'ansible_timeout' from source: unknown 30575 1726867568.57284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867568.57384: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867568.57404: variable 'omit' from source: magic vars 30575 1726867568.57414: starting attempt loop 30575 1726867568.57419: running the handler 30575 1726867568.57437: _low_level_execute_command(): starting 30575 1726867568.57483: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867568.58271: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867568.58319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867568.58366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867568.60057: stdout chunk (state=3): >>>/root <<< 30575 1726867568.60153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867568.60215: stderr chunk (state=3): >>><<< 30575 1726867568.60218: stdout chunk (state=3): >>><<< 30575 1726867568.60317: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867568.60320: _low_level_execute_command(): starting 30575 1726867568.60323: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867568.6023684-30752-94490969963326 `" && echo ansible-tmp-1726867568.6023684-30752-94490969963326="` echo /root/.ansible/tmp/ansible-tmp-1726867568.6023684-30752-94490969963326 `" ) && sleep 0' 30575 1726867568.60833: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867568.60847: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867568.60862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867568.60883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867568.60982: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867568.61013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867568.61093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867568.62968: stdout chunk (state=3): >>>ansible-tmp-1726867568.6023684-30752-94490969963326=/root/.ansible/tmp/ansible-tmp-1726867568.6023684-30752-94490969963326 <<< 30575 1726867568.63090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867568.63132: stderr chunk (state=3): >>><<< 30575 1726867568.63145: stdout chunk (state=3): >>><<< 30575 1726867568.63167: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867568.6023684-30752-94490969963326=/root/.ansible/tmp/ansible-tmp-1726867568.6023684-30752-94490969963326 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867568.63204: variable 'ansible_module_compression' from source: unknown 30575 1726867568.63265: ANSIBALLZ: Using generic lock for ansible.legacy.command 30575 1726867568.63280: ANSIBALLZ: Acquiring lock 30575 1726867568.63288: ANSIBALLZ: Lock acquired: 140240646918832 30575 1726867568.63306: ANSIBALLZ: Creating module 30575 1726867568.74431: ANSIBALLZ: Writing module into payload 30575 1726867568.74490: ANSIBALLZ: Writing module 30575 1726867568.74512: ANSIBALLZ: Renaming module 30575 1726867568.74516: ANSIBALLZ: Done creating module 30575 1726867568.74528: variable 'ansible_facts' from source: unknown 30575 1726867568.74569: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867568.6023684-30752-94490969963326/AnsiballZ_command.py 30575 1726867568.74665: Sending initial data 30575 1726867568.74669: Sent initial data (155 bytes) 30575 1726867568.75057: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867568.75095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867568.75098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867568.75100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config <<< 30575 1726867568.75102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867568.75105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867568.75140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867568.75153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867568.75206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867568.76808: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867568.76849: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867568.76897: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpj_7b9e5v /root/.ansible/tmp/ansible-tmp-1726867568.6023684-30752-94490969963326/AnsiballZ_command.py <<< 30575 1726867568.76901: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867568.6023684-30752-94490969963326/AnsiballZ_command.py" <<< 30575 1726867568.76941: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpj_7b9e5v" to remote "/root/.ansible/tmp/ansible-tmp-1726867568.6023684-30752-94490969963326/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867568.6023684-30752-94490969963326/AnsiballZ_command.py" <<< 30575 1726867568.77453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867568.77487: stderr chunk (state=3): >>><<< 30575 1726867568.77490: stdout chunk (state=3): >>><<< 30575 1726867568.77508: done transferring module to remote 30575 1726867568.77520: _low_level_execute_command(): starting 30575 1726867568.77526: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867568.6023684-30752-94490969963326/ /root/.ansible/tmp/ansible-tmp-1726867568.6023684-30752-94490969963326/AnsiballZ_command.py && sleep 0' 30575 1726867568.77928: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867568.77931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867568.77933: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867568.77935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867568.77937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867568.77983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867568.78000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867568.78039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867568.79789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867568.79808: stderr chunk (state=3): >>><<< 30575 1726867568.79813: stdout chunk (state=3): >>><<< 30575 1726867568.79829: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867568.79832: _low_level_execute_command(): starting 30575 1726867568.79835: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867568.6023684-30752-94490969963326/AnsiballZ_command.py && sleep 0' 30575 1726867568.80222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867568.80225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867568.80227: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867568.80229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867568.80231: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867568.80272: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867568.80290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867568.80339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867568.95651: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:26:08.950913", "end": "2024-09-20 17:26:08.954189", "delta": "0:00:00.003276", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867568.97101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867568.97133: stderr chunk (state=3): >>><<< 30575 1726867568.97136: stdout chunk (state=3): >>><<< 30575 1726867568.97151: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:26:08.950913", "end": "2024-09-20 17:26:08.954189", "delta": "0:00:00.003276", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867568.97184: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867568.6023684-30752-94490969963326/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867568.97191: _low_level_execute_command(): starting 30575 1726867568.97196: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867568.6023684-30752-94490969963326/ > /dev/null 2>&1 && sleep 0' 30575 1726867568.97654: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867568.97657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867568.97659: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867568.97662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867568.97670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867568.97726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867568.97731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867568.97733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867568.97768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867568.99583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867568.99605: stderr chunk (state=3): >>><<< 30575 1726867568.99608: stdout chunk (state=3): >>><<< 30575 1726867568.99620: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.15.68 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
30575 1726867568.99628: handler run complete
30575 1726867568.99646: Evaluated conditional (False): False
30575 1726867568.99659: attempt loop complete, returning result
30575 1726867568.99662: _execute() done
30575 1726867568.99664: dumping result to json
30575 1726867568.99666: done dumping result, returning
30575 1726867568.99676: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affcac9-a3a5-e081-a588-0000000000f5]
30575 1726867568.99682: sending task result for task 0affcac9-a3a5-e081-a588-0000000000f5
30575 1726867568.99772: done sending task result for task 0affcac9-a3a5-e081-a588-0000000000f5
30575 1726867568.99776: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.003276",
    "end": "2024-09-20 17:26:08.954189",
    "rc": 0,
    "start": "2024-09-20 17:26:08.950913"
}

STDOUT:

bonding_masters
eth0
lo

30575 1726867568.99846: no more pending results, returning
what we have 30575 1726867568.99850: results queue empty 30575 1726867568.99851: checking for any_errors_fatal 30575 1726867568.99852: done checking for any_errors_fatal 30575 1726867568.99853: checking for max_fail_percentage 30575 1726867568.99854: done checking for max_fail_percentage 30575 1726867568.99855: checking to see if all hosts have failed and the running result is not ok 30575 1726867568.99856: done checking to see if all hosts have failed 30575 1726867568.99857: getting the remaining hosts for this loop 30575 1726867568.99858: done getting the remaining hosts for this loop 30575 1726867568.99861: getting the next task for host managed_node3 30575 1726867568.99868: done getting next task for host managed_node3 30575 1726867568.99871: ^ task is: TASK: Set current_interfaces 30575 1726867568.99875: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30575 1726867568.99881: getting variables
30575 1726867568.99883: in VariableManager get_vars()
30575 1726867568.99917: Calling all_inventory to load vars for managed_node3
30575 1726867568.99919: Calling groups_inventory to load vars for managed_node3
30575 1726867568.99923: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867568.99933: Calling all_plugins_play to load vars for managed_node3
30575 1726867568.99936: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867568.99938: Calling groups_plugins_play to load vars for managed_node3
30575 1726867569.00099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867569.00220: done with get_vars()
30575 1726867569.00230: done getting variables
30575 1726867569.00271: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set current_interfaces] **************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9
Friday 20 September 2024 17:26:09 -0400 (0:00:00.447) 0:00:04.380 ******
30575 1726867569.00295: entering _queue_task() for managed_node3/set_fact
30575 1726867569.00488: worker is 1 (out of 1 available)
30575 1726867569.00500: exiting _queue_task() for managed_node3/set_fact
30575 1726867569.00512: done queuing things up, now waiting for results queue to drain
30575 1726867569.00514: waiting for pending results...
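For context on what the gather/set_fact pair in this log is doing: the "Gather current interface info" task above ran `ls -1` in /sys/class/net over SSH, and the "Set current_interfaces" task that follows turns that captured stdout into a list fact. A minimal sketch of that conversion, assuming nothing beyond the log itself (the helper name `parse_interface_listing` is hypothetical, not part of Ansible, which exposes the same data as the registered result's `stdout_lines`):

```python
def parse_interface_listing(stdout: str) -> list[str]:
    """Split `ls -1` output from /sys/class/net into a list of interface
    names, the same shape the set_fact task produces below."""
    return [line for line in stdout.splitlines() if line.strip()]


# The STDOUT captured above was three names, one per line.
interfaces = parse_interface_listing("bonding_masters\neth0\nlo")
```

With the stdout shown in the log, this yields the same list the set_fact task records: `['bonding_masters', 'eth0', 'lo']`.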
30575 1726867569.00666: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 30575 1726867569.00738: in run() - task 0affcac9-a3a5-e081-a588-0000000000f6 30575 1726867569.00748: variable 'ansible_search_path' from source: unknown 30575 1726867569.00752: variable 'ansible_search_path' from source: unknown 30575 1726867569.00780: calling self._execute() 30575 1726867569.00834: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867569.00837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867569.00849: variable 'omit' from source: magic vars 30575 1726867569.01100: variable 'ansible_distribution_major_version' from source: facts 30575 1726867569.01116: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867569.01119: variable 'omit' from source: magic vars 30575 1726867569.01153: variable 'omit' from source: magic vars 30575 1726867569.01226: variable '_current_interfaces' from source: set_fact 30575 1726867569.01315: variable 'omit' from source: magic vars 30575 1726867569.01348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867569.01374: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867569.01390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867569.01404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867569.01413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867569.01438: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867569.01449: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867569.01452: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867569.01519: Set connection var ansible_pipelining to False 30575 1726867569.01522: Set connection var ansible_shell_type to sh 30575 1726867569.01528: Set connection var ansible_shell_executable to /bin/sh 30575 1726867569.01532: Set connection var ansible_timeout to 10 30575 1726867569.01537: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867569.01543: Set connection var ansible_connection to ssh 30575 1726867569.01564: variable 'ansible_shell_executable' from source: unknown 30575 1726867569.01568: variable 'ansible_connection' from source: unknown 30575 1726867569.01570: variable 'ansible_module_compression' from source: unknown 30575 1726867569.01572: variable 'ansible_shell_type' from source: unknown 30575 1726867569.01575: variable 'ansible_shell_executable' from source: unknown 30575 1726867569.01579: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867569.01581: variable 'ansible_pipelining' from source: unknown 30575 1726867569.01583: variable 'ansible_timeout' from source: unknown 30575 1726867569.01588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867569.01687: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867569.01694: variable 'omit' from source: magic vars 30575 1726867569.01701: starting attempt loop 30575 1726867569.01703: running the handler 30575 1726867569.01712: handler run complete 30575 1726867569.01721: attempt loop complete, returning result 30575 1726867569.01726: _execute() done 30575 1726867569.01729: dumping result to json 30575 1726867569.01731: done dumping result, returning 30575 
1726867569.01734: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affcac9-a3a5-e081-a588-0000000000f6]
30575 1726867569.01739: sending task result for task 0affcac9-a3a5-e081-a588-0000000000f6
30575 1726867569.01813: done sending task result for task 0affcac9-a3a5-e081-a588-0000000000f6
30575 1726867569.01816: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "current_interfaces": [
            "bonding_masters",
            "eth0",
            "lo"
        ]
    },
    "changed": false
}
30575 1726867569.01894: no more pending results, returning what we have
30575 1726867569.01897: results queue empty
30575 1726867569.01898: checking for any_errors_fatal
30575 1726867569.01902: done checking for any_errors_fatal
30575 1726867569.01903: checking for max_fail_percentage
30575 1726867569.01904: done checking for max_fail_percentage
30575 1726867569.01905: checking to see if all hosts have failed and the running result is not ok
30575 1726867569.01906: done checking to see if all hosts have failed
30575 1726867569.01907: getting the remaining hosts for this loop
30575 1726867569.01908: done getting the remaining hosts for this loop
30575 1726867569.01911: getting the next task for host managed_node3
30575 1726867569.01917: done getting next task for host managed_node3
30575 1726867569.01919: ^ task is: TASK: Show current_interfaces
30575 1726867569.01927: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867569.01930: getting variables
30575 1726867569.01931: in VariableManager get_vars()
30575 1726867569.01954: Calling all_inventory to load vars for managed_node3
30575 1726867569.01956: Calling groups_inventory to load vars for managed_node3
30575 1726867569.01959: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867569.01966: Calling all_plugins_play to load vars for managed_node3
30575 1726867569.01968: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867569.01970: Calling groups_plugins_play to load vars for managed_node3
30575 1726867569.02099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867569.02210: done with get_vars()
30575 1726867569.02217: done getting variables
30575 1726867569.02255: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Show current_interfaces] *************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5
Friday 20 September 2024 17:26:09 -0400 (0:00:00.019) 0:00:04.400 ******
30575 1726867569.02274: entering _queue_task() for managed_node3/debug
30575 1726867569.02452: worker is 1 (out of 1 available)
30575 1726867569.02465: exiting _queue_task() for managed_node3/debug
30575 1726867569.02479: done queuing things up, now waiting for results queue to drain
30575 1726867569.02481: waiting for pending results...
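The round trip in this log (SSH to the managed node, run `ls -1` in /sys/class/net, register the output, set_fact) is how the test role collects interface names from a remote host; on the host itself the same information can be read straight out of sysfs. A sketch of that local equivalent, under the assumption that a /sys/class/net-style directory is available (the directory is a parameter so the function can be exercised against any path, and the function name is illustrative):

```python
import os


def list_interfaces(sysfs_net: str = "/sys/class/net") -> list[str]:
    """Return the entries of a /sys/class/net-style directory, sorted,
    mirroring what `ls -1` in that directory reports."""
    return sorted(os.listdir(sysfs_net))
```

On the managed node traced above, this would report the same three entries the log shows: bonding_masters, eth0, and lo.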
30575 1726867569.02616: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 30575 1726867569.02671: in run() - task 0affcac9-a3a5-e081-a588-0000000000bb 30575 1726867569.02683: variable 'ansible_search_path' from source: unknown 30575 1726867569.02687: variable 'ansible_search_path' from source: unknown 30575 1726867569.02712: calling self._execute() 30575 1726867569.02763: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867569.02767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867569.02774: variable 'omit' from source: magic vars 30575 1726867569.03009: variable 'ansible_distribution_major_version' from source: facts 30575 1726867569.03017: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867569.03022: variable 'omit' from source: magic vars 30575 1726867569.03054: variable 'omit' from source: magic vars 30575 1726867569.03119: variable 'current_interfaces' from source: set_fact 30575 1726867569.03142: variable 'omit' from source: magic vars 30575 1726867569.03168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867569.03194: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867569.03208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867569.03220: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867569.03230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867569.03252: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867569.03255: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867569.03258: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867569.03326: Set connection var ansible_pipelining to False 30575 1726867569.03330: Set connection var ansible_shell_type to sh 30575 1726867569.03332: Set connection var ansible_shell_executable to /bin/sh 30575 1726867569.03335: Set connection var ansible_timeout to 10 30575 1726867569.03341: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867569.03347: Set connection var ansible_connection to ssh 30575 1726867569.03365: variable 'ansible_shell_executable' from source: unknown 30575 1726867569.03368: variable 'ansible_connection' from source: unknown 30575 1726867569.03371: variable 'ansible_module_compression' from source: unknown 30575 1726867569.03373: variable 'ansible_shell_type' from source: unknown 30575 1726867569.03375: variable 'ansible_shell_executable' from source: unknown 30575 1726867569.03379: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867569.03382: variable 'ansible_pipelining' from source: unknown 30575 1726867569.03384: variable 'ansible_timeout' from source: unknown 30575 1726867569.03394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867569.03480: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867569.03488: variable 'omit' from source: magic vars 30575 1726867569.03493: starting attempt loop 30575 1726867569.03497: running the handler 30575 1726867569.03530: handler run complete 30575 1726867569.03539: attempt loop complete, returning result 30575 1726867569.03542: _execute() done 30575 1726867569.03545: dumping result to json 30575 1726867569.03547: done dumping result, returning 30575 1726867569.03554: done 
running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affcac9-a3a5-e081-a588-0000000000bb]
30575 1726867569.03557: sending task result for task 0affcac9-a3a5-e081-a588-0000000000bb
30575 1726867569.03634: done sending task result for task 0affcac9-a3a5-e081-a588-0000000000bb
30575 1726867569.03637: WORKER PROCESS EXITING
ok: [managed_node3] => {}

MSG:

current_interfaces: ['bonding_masters', 'eth0', 'lo']
30575 1726867569.03711: no more pending results, returning what we have
30575 1726867569.03714: results queue empty
30575 1726867569.03715: checking for any_errors_fatal
30575 1726867569.03719: done checking for any_errors_fatal
30575 1726867569.03719: checking for max_fail_percentage
30575 1726867569.03720: done checking for max_fail_percentage
30575 1726867569.03721: checking to see if all hosts have failed and the running result is not ok
30575 1726867569.03722: done checking to see if all hosts have failed
30575 1726867569.03725: getting the remaining hosts for this loop
30575 1726867569.03726: done getting the remaining hosts for this loop
30575 1726867569.03729: getting the next task for host managed_node3
30575 1726867569.03735: done getting next task for host managed_node3
30575 1726867569.03738: ^ task is: TASK: Setup
30575 1726867569.03740: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867569.03743: getting variables
30575 1726867569.03743: in VariableManager get_vars()
30575 1726867569.03764: Calling all_inventory to load vars for managed_node3
30575 1726867569.03765: Calling groups_inventory to load vars for managed_node3
30575 1726867569.03767: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867569.03773: Calling all_plugins_play to load vars for managed_node3
30575 1726867569.03775: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867569.03776: Calling groups_plugins_play to load vars for managed_node3
30575 1726867569.03880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867569.03991: done with get_vars()
30575 1726867569.03997: done getting variables

TASK [Setup] *******************************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24
Friday 20 September 2024 17:26:09 -0400 (0:00:00.017) 0:00:04.418 ******
30575 1726867569.04055: entering _queue_task() for managed_node3/include_tasks
30575 1726867569.04225: worker is 1 (out of 1 available)
30575 1726867569.04238: exiting _queue_task() for managed_node3/include_tasks
30575 1726867569.04249: done queuing things up, now waiting for results queue to drain
30575 1726867569.04251: waiting for pending results...
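The Setup task queued here is an include_tasks loop: per the log that follows, each item of `lsr_setup` (tasks/delete_interface.yml and tasks/assert_device_absent.yml) is resolved against the test playbook directory, loaded, filtered on tags, and spliced into the host's task list. A rough sketch of just the path-resolution step, with illustrative names that are not Ansible internals:

```python
import posixpath


def expand_includes(base_dir: str, items: list[str]) -> list[str]:
    """Resolve each include item to a full path, the way the log shows
    tasks/delete_interface.yml becoming an absolute playbook path."""
    return [posixpath.join(base_dir, item) for item in items]


# Items and base directory as they appear in this log.
lsr_setup = ["tasks/delete_interface.yml", "tasks/assert_device_absent.yml"]
paths = expand_includes(
    "/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks",
    lsr_setup,
)
```

The real loader also parses each file into blocks and extends the task list per host, which this sketch deliberately omits.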
30575 1726867569.04375: running TaskExecutor() for managed_node3/TASK: Setup 30575 1726867569.04431: in run() - task 0affcac9-a3a5-e081-a588-000000000094 30575 1726867569.04441: variable 'ansible_search_path' from source: unknown 30575 1726867569.04444: variable 'ansible_search_path' from source: unknown 30575 1726867569.04478: variable 'lsr_setup' from source: include params 30575 1726867569.04608: variable 'lsr_setup' from source: include params 30575 1726867569.04655: variable 'omit' from source: magic vars 30575 1726867569.04782: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867569.04784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867569.04786: variable 'omit' from source: magic vars 30575 1726867569.04937: variable 'ansible_distribution_major_version' from source: facts 30575 1726867569.04945: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867569.04951: variable 'item' from source: unknown 30575 1726867569.04996: variable 'item' from source: unknown 30575 1726867569.05019: variable 'item' from source: unknown 30575 1726867569.05065: variable 'item' from source: unknown 30575 1726867569.05170: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867569.05173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867569.05176: variable 'omit' from source: magic vars 30575 1726867569.05254: variable 'ansible_distribution_major_version' from source: facts 30575 1726867569.05258: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867569.05263: variable 'item' from source: unknown 30575 1726867569.05310: variable 'item' from source: unknown 30575 1726867569.05330: variable 'item' from source: unknown 30575 1726867569.05370: variable 'item' from source: unknown 30575 1726867569.05431: dumping result to json 30575 1726867569.05434: done dumping result, returning 30575 
1726867569.05436: done running TaskExecutor() for managed_node3/TASK: Setup [0affcac9-a3a5-e081-a588-000000000094] 30575 1726867569.05438: sending task result for task 0affcac9-a3a5-e081-a588-000000000094 30575 1726867569.05470: done sending task result for task 0affcac9-a3a5-e081-a588-000000000094 30575 1726867569.05473: WORKER PROCESS EXITING 30575 1726867569.05497: no more pending results, returning what we have 30575 1726867569.05502: in VariableManager get_vars() 30575 1726867569.05533: Calling all_inventory to load vars for managed_node3 30575 1726867569.05535: Calling groups_inventory to load vars for managed_node3 30575 1726867569.05538: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867569.05547: Calling all_plugins_play to load vars for managed_node3 30575 1726867569.05549: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867569.05552: Calling groups_plugins_play to load vars for managed_node3 30575 1726867569.05693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867569.05801: done with get_vars() 30575 1726867569.05806: variable 'ansible_search_path' from source: unknown 30575 1726867569.05806: variable 'ansible_search_path' from source: unknown 30575 1726867569.05835: variable 'ansible_search_path' from source: unknown 30575 1726867569.05836: variable 'ansible_search_path' from source: unknown 30575 1726867569.05852: we have included files to process 30575 1726867569.05853: generating all_blocks data 30575 1726867569.05854: done generating all_blocks data 30575 1726867569.05856: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30575 1726867569.05856: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30575 1726867569.05858: Loading data from 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30575 1726867569.05995: done processing included file 30575 1726867569.05997: iterating over new_blocks loaded from include file 30575 1726867569.05997: in VariableManager get_vars() 30575 1726867569.06005: done with get_vars() 30575 1726867569.06006: filtering new block on tags 30575 1726867569.06020: done filtering new block on tags 30575 1726867569.06021: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node3 => (item=tasks/delete_interface.yml) 30575 1726867569.06027: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30575 1726867569.06028: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30575 1726867569.06031: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30575 1726867569.06102: in VariableManager get_vars() 30575 1726867569.06113: done with get_vars() 30575 1726867569.06192: done processing included file 30575 1726867569.06194: iterating over new_blocks loaded from include file 30575 1726867569.06194: in VariableManager get_vars() 30575 1726867569.06202: done with get_vars() 30575 1726867569.06203: filtering new block on tags 30575 1726867569.06221: done filtering new block on tags 30575 1726867569.06222: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node3 => (item=tasks/assert_device_absent.yml) 30575 1726867569.06227: extending task lists for all hosts with 
included blocks 30575 1726867569.06587: done extending task lists 30575 1726867569.06588: done processing included files 30575 1726867569.06589: results queue empty 30575 1726867569.06589: checking for any_errors_fatal 30575 1726867569.06591: done checking for any_errors_fatal 30575 1726867569.06591: checking for max_fail_percentage 30575 1726867569.06592: done checking for max_fail_percentage 30575 1726867569.06592: checking to see if all hosts have failed and the running result is not ok 30575 1726867569.06593: done checking to see if all hosts have failed 30575 1726867569.06593: getting the remaining hosts for this loop 30575 1726867569.06594: done getting the remaining hosts for this loop 30575 1726867569.06596: getting the next task for host managed_node3 30575 1726867569.06598: done getting next task for host managed_node3 30575 1726867569.06600: ^ task is: TASK: Remove test interface if necessary 30575 1726867569.06601: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30575 1726867569.06603: getting variables
30575 1726867569.06604: in VariableManager get_vars()
30575 1726867569.06612: Calling all_inventory to load vars for managed_node3
30575 1726867569.06614: Calling groups_inventory to load vars for managed_node3
30575 1726867569.06615: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867569.06618: Calling all_plugins_play to load vars for managed_node3
30575 1726867569.06620: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867569.06621: Calling groups_plugins_play to load vars for managed_node3
30575 1726867569.06702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867569.06808: done with get_vars()
30575 1726867569.06813: done getting variables
30575 1726867569.06841: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Remove test interface if necessary] **************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3
Friday 20 September 2024 17:26:09 -0400 (0:00:00.028) 0:00:04.446 ******
30575 1726867569.06858: entering _queue_task() for managed_node3/command
30575 1726867569.07029: worker is 1 (out of 1 available)
30575 1726867569.07043: exiting _queue_task() for managed_node3/command
30575 1726867569.07056: done queuing things up, now waiting for results queue to drain
30575 1726867569.07057: waiting for pending results...
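Before this command task runs its real payload, the trace that follows shows the connection plugin first executing `/bin/sh -c 'echo ~ && sleep 0'` over the multiplexed SSH session; the `/root` it gets back anchors the remote temp directory (`~/.ansible/tmp`). A local reproduction of that probe, as a sketch assuming a POSIX /bin/sh is present:

```python
import subprocess

# Reproduce Ansible's home-directory probe as seen in this log; the
# `&& sleep 0` tail keeps the exit status clean across shells.
result = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True,
    text=True,
)
home = result.stdout.strip()  # "/root" on the managed node traced above
```

The exact home path depends on the account running the probe, so no fixed output is assumed here.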
30575 1726867569.07185: running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary 30575 1726867569.07241: in run() - task 0affcac9-a3a5-e081-a588-00000000011b 30575 1726867569.07252: variable 'ansible_search_path' from source: unknown 30575 1726867569.07255: variable 'ansible_search_path' from source: unknown 30575 1726867569.07281: calling self._execute() 30575 1726867569.07332: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867569.07336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867569.07344: variable 'omit' from source: magic vars 30575 1726867569.07570: variable 'ansible_distribution_major_version' from source: facts 30575 1726867569.07580: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867569.07586: variable 'omit' from source: magic vars 30575 1726867569.07615: variable 'omit' from source: magic vars 30575 1726867569.07682: variable 'interface' from source: play vars 30575 1726867569.07696: variable 'omit' from source: magic vars 30575 1726867569.07726: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867569.07751: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867569.07765: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867569.07779: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867569.07789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867569.07809: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867569.07812: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867569.07817: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867569.07882: Set connection var ansible_pipelining to False 30575 1726867569.07885: Set connection var ansible_shell_type to sh 30575 1726867569.07890: Set connection var ansible_shell_executable to /bin/sh 30575 1726867569.07895: Set connection var ansible_timeout to 10 30575 1726867569.07901: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867569.07907: Set connection var ansible_connection to ssh 30575 1726867569.07926: variable 'ansible_shell_executable' from source: unknown 30575 1726867569.07929: variable 'ansible_connection' from source: unknown 30575 1726867569.07932: variable 'ansible_module_compression' from source: unknown 30575 1726867569.07934: variable 'ansible_shell_type' from source: unknown 30575 1726867569.07936: variable 'ansible_shell_executable' from source: unknown 30575 1726867569.07938: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867569.07940: variable 'ansible_pipelining' from source: unknown 30575 1726867569.07942: variable 'ansible_timeout' from source: unknown 30575 1726867569.07944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867569.08035: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867569.08042: variable 'omit' from source: magic vars 30575 1726867569.08047: starting attempt loop 30575 1726867569.08050: running the handler 30575 1726867569.08063: _low_level_execute_command(): starting 30575 1726867569.08074: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867569.08576: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867569.08582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867569.08585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867569.08587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867569.08633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867569.08645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867569.08700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867569.10327: stdout chunk (state=3): >>>/root <<< 30575 1726867569.10418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867569.10443: stderr chunk (state=3): >>><<< 30575 1726867569.10447: stdout chunk (state=3): >>><<< 30575 1726867569.10469: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867569.10481: _low_level_execute_command(): starting 30575 1726867569.10486: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867569.1046698-30770-45066947091839 `" && echo ansible-tmp-1726867569.1046698-30770-45066947091839="` echo /root/.ansible/tmp/ansible-tmp-1726867569.1046698-30770-45066947091839 `" ) && sleep 0' 30575 1726867569.10901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867569.10904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867569.10906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867569.10915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867569.10959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867569.10963: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867569.11014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867569.12884: stdout chunk (state=3): >>>ansible-tmp-1726867569.1046698-30770-45066947091839=/root/.ansible/tmp/ansible-tmp-1726867569.1046698-30770-45066947091839 <<< 30575 1726867569.12989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867569.13015: stderr chunk (state=3): >>><<< 30575 1726867569.13018: stdout chunk (state=3): >>><<< 30575 1726867569.13031: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867569.1046698-30770-45066947091839=/root/.ansible/tmp/ansible-tmp-1726867569.1046698-30770-45066947091839 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867569.13052: variable 'ansible_module_compression' from source: unknown 30575 1726867569.13091: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867569.13123: variable 'ansible_facts' from source: unknown 30575 1726867569.13175: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867569.1046698-30770-45066947091839/AnsiballZ_command.py 30575 1726867569.13266: Sending initial data 30575 1726867569.13270: Sent initial data (155 bytes) 30575 1726867569.13683: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867569.13686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867569.13689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867569.13691: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867569.13693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867569.13755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867569.13759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867569.13818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867569.15348: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867569.15394: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867569.15442: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpn5oasux5 /root/.ansible/tmp/ansible-tmp-1726867569.1046698-30770-45066947091839/AnsiballZ_command.py <<< 30575 1726867569.15445: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867569.1046698-30770-45066947091839/AnsiballZ_command.py" <<< 30575 1726867569.15482: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpn5oasux5" to remote "/root/.ansible/tmp/ansible-tmp-1726867569.1046698-30770-45066947091839/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867569.1046698-30770-45066947091839/AnsiballZ_command.py" <<< 30575 1726867569.16196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867569.16237: stderr chunk (state=3): >>><<< 30575 1726867569.16373: stdout chunk (state=3): >>><<< 30575 1726867569.16379: done transferring module to remote 30575 1726867569.16381: _low_level_execute_command(): starting 30575 1726867569.16384: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867569.1046698-30770-45066947091839/ /root/.ansible/tmp/ansible-tmp-1726867569.1046698-30770-45066947091839/AnsiballZ_command.py && sleep 0' 30575 1726867569.16891: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867569.16908: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867569.16934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867569.17040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867569.17067: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867569.17088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867569.17171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867569.18895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867569.18933: stderr chunk (state=3): >>><<< 30575 1726867569.18941: stdout chunk (state=3): >>><<< 30575 1726867569.18958: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867569.18966: _low_level_execute_command(): starting 30575 1726867569.18974: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867569.1046698-30770-45066947091839/AnsiballZ_command.py && sleep 0' 30575 1726867569.19539: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867569.19553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867569.19567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867569.19588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867569.19605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867569.19618: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867569.19636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867569.19732: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867569.19758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867569.19846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867569.35770: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 17:26:09.346458", "end": "2024-09-20 17:26:09.354188", "delta": "0:00:00.007730", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867569.37101: stderr chunk (state=3): >>>debug2: Received exit status from master 1 <<< 30575 1726867569.37271: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867569.37275: stderr chunk (state=3): >>><<< 30575 1726867569.37279: stdout chunk (state=3): >>><<< 30575 1726867569.37281: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 17:26:09.346458", "end": "2024-09-20 17:26:09.354188", "delta": "0:00:00.007730", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. 
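The "Cannot find device \"statebr\"" failure above is the benign outcome of an unconditional `ip link del` on an interface that was never created. As a hedged aside, one common way to write such a cleanup task so that this specific rc=1 counts as success is sketched below; this is an illustrative pattern only, not the actual content of delete_interface.yml, which this log does not reproduce:

```yaml
# Illustrative sketch only: the real delete_interface.yml is not shown in this log.
- name: Remove test interface if necessary
  ansible.builtin.command: ip link del {{ interface }}
  register: link_del
  changed_when: link_del.rc == 0
  # Treat "device does not exist" as the desired end state, not a failure
  failed_when:
    - link_del.rc != 0
    - "'Cannot find device' not in link_del.stderr"
```

The trace here instead shows the task failing with rc=1 and the play tolerating it (the "...ignoring" marker on the task result).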
30575 1726867569.37360: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867569.1046698-30770-45066947091839/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867569.37363: _low_level_execute_command(): starting 30575 1726867569.37365: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867569.1046698-30770-45066947091839/ > /dev/null 2>&1 && sleep 0' 30575 1726867569.37897: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867569.37901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867569.37903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867569.37910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867569.37912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867569.37952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867569.37959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867569.38006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867569.39832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867569.39861: stderr chunk (state=3): >>><<< 30575 1726867569.39864: stdout chunk (state=3): >>><<< 30575 1726867569.39909: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867569.39912: handler run complete 30575 1726867569.39958: Evaluated conditional (False): False 30575 1726867569.39961: attempt loop complete, returning result 30575 1726867569.39963: _execute() done 30575 1726867569.39965: dumping result to json 30575 1726867569.39967: done dumping result, returning 30575 1726867569.39969: done running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary [0affcac9-a3a5-e081-a588-00000000011b] 30575 1726867569.39971: sending task result for task 0affcac9-a3a5-e081-a588-00000000011b
fatal: [managed_node3]: FAILED! => {
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "del",
        "statebr"
    ],
    "delta": "0:00:00.007730",
    "end": "2024-09-20 17:26:09.354188",
    "rc": 1,
    "start": "2024-09-20 17:26:09.346458"
}

STDERR:

Cannot find device "statebr"

MSG:

non-zero return code
...ignoring
30575 1726867569.40119: no more pending results, returning what we have 30575 1726867569.40124: results queue empty 30575 1726867569.40124: checking for any_errors_fatal 30575 1726867569.40126: done checking for any_errors_fatal 30575 1726867569.40126: checking for max_fail_percentage 30575 1726867569.40128: done checking for max_fail_percentage 30575 1726867569.40129: checking to see if all hosts have failed and the running result is not ok 30575 1726867569.40130: done checking to see if all hosts have failed 30575 1726867569.40130: getting the remaining hosts for this loop 30575 1726867569.40131: done getting the remaining hosts for this loop 30575 1726867569.40135: getting the next task for host managed_node3 30575 1726867569.40143: done getting next task for host managed_node3 30575 1726867569.40146: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30575 1726867569.40150: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867569.40155: getting variables 30575 1726867569.40156: in VariableManager get_vars() 30575 1726867569.40185: Calling all_inventory to load vars for managed_node3 30575 1726867569.40188: Calling groups_inventory to load vars for managed_node3 30575 1726867569.40191: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867569.40201: Calling all_plugins_play to load vars for managed_node3 30575 1726867569.40203: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867569.40206: Calling groups_plugins_play to load vars for managed_node3 30575 1726867569.40374: done sending task result for task 0affcac9-a3a5-e081-a588-00000000011b 30575 1726867569.40379: WORKER PROCESS EXITING 30575 1726867569.40389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867569.40505: done with get_vars() 30575 1726867569.40514: done getting variables

TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3
Friday 20 September 2024 17:26:09 -0400 (0:00:00.337) 0:00:04.783 ******

30575 1726867569.40579: entering _queue_task() for 
managed_node3/include_tasks 30575 1726867569.40761: worker is 1 (out of 1 available) 30575 1726867569.40773: exiting _queue_task() for managed_node3/include_tasks 30575 1726867569.40788: done queuing things up, now waiting for results queue to drain 30575 1726867569.40789: waiting for pending results... 30575 1726867569.40938: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 30575 1726867569.41001: in run() - task 0affcac9-a3a5-e081-a588-00000000011f 30575 1726867569.41016: variable 'ansible_search_path' from source: unknown 30575 1726867569.41019: variable 'ansible_search_path' from source: unknown 30575 1726867569.41047: calling self._execute() 30575 1726867569.41101: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867569.41105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867569.41114: variable 'omit' from source: magic vars 30575 1726867569.41366: variable 'ansible_distribution_major_version' from source: facts 30575 1726867569.41375: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867569.41382: _execute() done 30575 1726867569.41387: dumping result to json 30575 1726867569.41390: done dumping result, returning 30575 1726867569.41396: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-e081-a588-00000000011f] 30575 1726867569.41401: sending task result for task 0affcac9-a3a5-e081-a588-00000000011f 30575 1726867569.41479: done sending task result for task 0affcac9-a3a5-e081-a588-00000000011f 30575 1726867569.41482: WORKER PROCESS EXITING 30575 1726867569.41506: no more pending results, returning what we have 30575 1726867569.41510: in VariableManager get_vars() 30575 1726867569.41542: Calling all_inventory to load vars for managed_node3 30575 1726867569.41544: Calling groups_inventory to load vars for managed_node3 30575 1726867569.41548: Calling 
all_plugins_inventory to load vars for managed_node3 30575 1726867569.41557: Calling all_plugins_play to load vars for managed_node3 30575 1726867569.41559: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867569.41561: Calling groups_plugins_play to load vars for managed_node3 30575 1726867569.41683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867569.41795: done with get_vars() 30575 1726867569.41802: variable 'ansible_search_path' from source: unknown 30575 1726867569.41803: variable 'ansible_search_path' from source: unknown 30575 1726867569.41809: variable 'item' from source: include params 30575 1726867569.41898: variable 'item' from source: include params 30575 1726867569.41940: we have included files to process 30575 1726867569.41941: generating all_blocks data 30575 1726867569.41946: done generating all_blocks data 30575 1726867569.41950: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867569.41951: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867569.41954: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867569.42163: done processing included file 30575 1726867569.42165: iterating over new_blocks loaded from include file 30575 1726867569.42167: in VariableManager get_vars() 30575 1726867569.42182: done with get_vars() 30575 1726867569.42184: filtering new block on tags 30575 1726867569.42207: done filtering new block on tags 30575 1726867569.42209: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 
30575 1726867569.42213: extending task lists for all hosts with included blocks 30575 1726867569.42365: done extending task lists 30575 1726867569.42366: done processing included files 30575 1726867569.42367: results queue empty 30575 1726867569.42367: checking for any_errors_fatal 30575 1726867569.42371: done checking for any_errors_fatal 30575 1726867569.42372: checking for max_fail_percentage 30575 1726867569.42373: done checking for max_fail_percentage 30575 1726867569.42373: checking to see if all hosts have failed and the running result is not ok 30575 1726867569.42374: done checking to see if all hosts have failed 30575 1726867569.42375: getting the remaining hosts for this loop 30575 1726867569.42376: done getting the remaining hosts for this loop 30575 1726867569.42380: getting the next task for host managed_node3 30575 1726867569.42384: done getting next task for host managed_node3 30575 1726867569.42387: ^ task is: TASK: Get stat for interface {{ interface }} 30575 1726867569.42390: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867569.42392: getting variables 30575 1726867569.42393: in VariableManager get_vars() 30575 1726867569.42401: Calling all_inventory to load vars for managed_node3 30575 1726867569.42403: Calling groups_inventory to load vars for managed_node3 30575 1726867569.42404: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867569.42409: Calling all_plugins_play to load vars for managed_node3 30575 1726867569.42411: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867569.42414: Calling groups_plugins_play to load vars for managed_node3 30575 1726867569.42571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867569.42753: done with get_vars() 30575 1726867569.42762: done getting variables 30575 1726867569.42870: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:26:09 -0400 (0:00:00.023) 0:00:04.806 ****** 30575 1726867569.42897: entering _queue_task() for managed_node3/stat 30575 1726867569.43116: worker is 1 (out of 1 available) 30575 1726867569.43128: exiting _queue_task() for managed_node3/stat 30575 1726867569.43141: done queuing things up, now waiting for results queue to drain 30575 1726867569.43142: waiting for pending results... 
30575 1726867569.43399: running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr 30575 1726867569.43465: in run() - task 0affcac9-a3a5-e081-a588-00000000016e 30575 1726867569.43475: variable 'ansible_search_path' from source: unknown 30575 1726867569.43481: variable 'ansible_search_path' from source: unknown 30575 1726867569.43505: calling self._execute() 30575 1726867569.43561: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867569.43564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867569.43573: variable 'omit' from source: magic vars 30575 1726867569.43812: variable 'ansible_distribution_major_version' from source: facts 30575 1726867569.43821: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867569.43828: variable 'omit' from source: magic vars 30575 1726867569.43864: variable 'omit' from source: magic vars 30575 1726867569.43935: variable 'interface' from source: play vars 30575 1726867569.43950: variable 'omit' from source: magic vars 30575 1726867569.43981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867569.44006: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867569.44021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867569.44035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867569.44045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867569.44069: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867569.44079: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867569.44081: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867569.44146: Set connection var ansible_pipelining to False 30575 1726867569.44149: Set connection var ansible_shell_type to sh 30575 1726867569.44153: Set connection var ansible_shell_executable to /bin/sh 30575 1726867569.44159: Set connection var ansible_timeout to 10 30575 1726867569.44164: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867569.44170: Set connection var ansible_connection to ssh 30575 1726867569.44193: variable 'ansible_shell_executable' from source: unknown 30575 1726867569.44197: variable 'ansible_connection' from source: unknown 30575 1726867569.44200: variable 'ansible_module_compression' from source: unknown 30575 1726867569.44202: variable 'ansible_shell_type' from source: unknown 30575 1726867569.44205: variable 'ansible_shell_executable' from source: unknown 30575 1726867569.44207: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867569.44209: variable 'ansible_pipelining' from source: unknown 30575 1726867569.44211: variable 'ansible_timeout' from source: unknown 30575 1726867569.44215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867569.44355: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867569.44363: variable 'omit' from source: magic vars 30575 1726867569.44368: starting attempt loop 30575 1726867569.44371: running the handler 30575 1726867569.44383: _low_level_execute_command(): starting 30575 1726867569.44396: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867569.44899: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867569.44904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867569.44952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867569.44973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867569.45059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867569.46672: stdout chunk (state=3): >>>/root <<< 30575 1726867569.46773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867569.46801: stderr chunk (state=3): >>><<< 30575 1726867569.46804: stdout chunk (state=3): >>><<< 30575 1726867569.46822: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867569.46833: _low_level_execute_command(): starting 30575 1726867569.46839: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867569.4682128-30796-208530951369167 `" && echo ansible-tmp-1726867569.4682128-30796-208530951369167="` echo /root/.ansible/tmp/ansible-tmp-1726867569.4682128-30796-208530951369167 `" ) && sleep 0' 30575 1726867569.47251: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867569.47254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867569.47257: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867569.47266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867569.47308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867569.47312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867569.47365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867569.49260: stdout chunk (state=3): >>>ansible-tmp-1726867569.4682128-30796-208530951369167=/root/.ansible/tmp/ansible-tmp-1726867569.4682128-30796-208530951369167 <<< 30575 1726867569.49348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867569.49370: stderr chunk (state=3): >>><<< 30575 1726867569.49374: stdout chunk (state=3): >>><<< 30575 1726867569.49388: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867569.4682128-30796-208530951369167=/root/.ansible/tmp/ansible-tmp-1726867569.4682128-30796-208530951369167 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867569.49424: variable 'ansible_module_compression' from source: unknown 30575 1726867569.49469: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30575 1726867569.49498: variable 'ansible_facts' from source: unknown 30575 1726867569.49561: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867569.4682128-30796-208530951369167/AnsiballZ_stat.py 30575 1726867569.49653: Sending initial data 30575 1726867569.49657: Sent initial data (153 bytes) 30575 1726867569.50056: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867569.50059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867569.50061: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867569.50063: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867569.50115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867569.50125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867569.50163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867569.51698: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 30575 1726867569.51701: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867569.51739: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867569.51782: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp5bejylu5 /root/.ansible/tmp/ansible-tmp-1726867569.4682128-30796-208530951369167/AnsiballZ_stat.py <<< 30575 1726867569.51785: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867569.4682128-30796-208530951369167/AnsiballZ_stat.py" <<< 30575 1726867569.51826: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp5bejylu5" to remote "/root/.ansible/tmp/ansible-tmp-1726867569.4682128-30796-208530951369167/AnsiballZ_stat.py" <<< 30575 1726867569.51829: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867569.4682128-30796-208530951369167/AnsiballZ_stat.py" <<< 30575 1726867569.52468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867569.52499: stderr chunk (state=3): >>><<< 30575 1726867569.52642: stdout chunk (state=3): >>><<< 30575 1726867569.52646: done transferring module to remote 30575 1726867569.52648: _low_level_execute_command(): starting 30575 1726867569.52651: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867569.4682128-30796-208530951369167/ /root/.ansible/tmp/ansible-tmp-1726867569.4682128-30796-208530951369167/AnsiballZ_stat.py && sleep 0' 30575 1726867569.53114: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867569.53129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867569.53140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867569.53189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867569.53219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867569.53261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867569.54983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867569.55004: stderr chunk (state=3): >>><<< 30575 1726867569.55008: stdout chunk (state=3): >>><<< 30575 1726867569.55018: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867569.55021: _low_level_execute_command(): starting 30575 1726867569.55028: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867569.4682128-30796-208530951369167/AnsiballZ_stat.py && sleep 0' 30575 1726867569.55417: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867569.55420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867569.55422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867569.55428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867569.55430: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867569.55471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867569.55519: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867569.55586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867569.70835: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30575 1726867569.72183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867569.72187: stdout chunk (state=3): >>><<< 30575 1726867569.72189: stderr chunk (state=3): >>><<< 30575 1726867569.72297: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867569.72303: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867569.4682128-30796-208530951369167/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867569.72306: _low_level_execute_command(): starting 30575 1726867569.72308: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867569.4682128-30796-208530951369167/ > /dev/null 2>&1 && sleep 0' 30575 1726867569.73414: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867569.73431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867569.73443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867569.73473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867569.73581: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867569.73605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867569.73690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867569.75581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867569.75585: stdout chunk (state=3): >>><<< 30575 1726867569.75587: stderr chunk (state=3): >>><<< 30575 1726867569.75783: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
30575 1726867569.75787: handler run complete
30575 1726867569.75789: attempt loop complete, returning result
30575 1726867569.75791: _execute() done
30575 1726867569.75794: dumping result to json
30575 1726867569.75796: done dumping result, returning
30575 1726867569.75798: done running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr [0affcac9-a3a5-e081-a588-00000000016e]
30575 1726867569.75800: sending task result for task 0affcac9-a3a5-e081-a588-00000000016e
30575 1726867569.75879: done sending task result for task 0affcac9-a3a5-e081-a588-00000000016e
30575 1726867569.75883: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
30575 1726867569.75949: no more pending results, returning what we have
30575 1726867569.75953: results queue empty
30575 1726867569.75954: checking for any_errors_fatal
30575 1726867569.75955: done checking for any_errors_fatal
30575 1726867569.75956: checking for max_fail_percentage
30575 1726867569.75958: done checking for max_fail_percentage
30575 1726867569.75959: checking to see if all hosts have failed and the running result is not ok
30575 1726867569.75960: done checking to see if all hosts have failed
30575 1726867569.75961: getting the remaining hosts for this loop
30575 1726867569.75962: done getting the remaining hosts for this loop
30575 1726867569.75966: getting the next task for host managed_node3
30575 1726867569.75976: done getting next task for host managed_node3
30575 1726867569.75987: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}'
30575 1726867569.75992: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867569.75997: getting variables
30575 1726867569.75999: in VariableManager get_vars()
30575 1726867569.76034: Calling all_inventory to load vars for managed_node3
30575 1726867569.76037: Calling groups_inventory to load vars for managed_node3
30575 1726867569.76041: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867569.76052: Calling all_plugins_play to load vars for managed_node3
30575 1726867569.76055: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867569.76058: Calling groups_plugins_play to load vars for managed_node3
30575 1726867569.76496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867569.76712: done with get_vars()
30575 1726867569.76725: done getting variables
30575 1726867569.76825: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
30575 1726867569.76951: variable 'interface' from source: play vars

TASK [Assert that the interface is absent - 'statebr'] *************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5
Friday 20 September 2024 17:26:09 -0400 (0:00:00.340) 0:00:05.147 ******
30575 1726867569.76991: entering _queue_task() for managed_node3/assert
30575 1726867569.76993: Creating lock for assert
30575 1726867569.77290: worker is 1 (out of 1 available)
30575 1726867569.77415: exiting _queue_task() for managed_node3/assert
30575 1726867569.77427: done queuing things up, now waiting for results queue to drain
30575 1726867569.77429: waiting for pending results...
30575 1726867569.77640: running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr'
30575 1726867569.77738: in run() - task 0affcac9-a3a5-e081-a588-000000000120
30575 1726867569.77743: variable 'ansible_search_path' from source: unknown
30575 1726867569.77746: variable 'ansible_search_path' from source: unknown
30575 1726867569.77765: calling self._execute()
30575 1726867569.77845: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867569.77858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867569.77879: variable 'omit' from source: magic vars
30575 1726867569.78240: variable 'ansible_distribution_major_version' from source: facts
30575 1726867569.78283: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867569.78286: variable 'omit' from source: magic vars
30575 1726867569.78322: variable 'omit' from source: magic vars
30575 1726867569.78436: variable 'interface' from source: play vars
30575 1726867569.78500: variable 'omit' from source: magic vars
30575 1726867569.78510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30575 1726867569.78555: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30575 1726867569.78581: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30575 1726867569.78607: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867569.78631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867569.78717: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30575 1726867569.78720: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867569.78722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867569.78793: Set connection var ansible_pipelining to False
30575 1726867569.78803: Set connection var ansible_shell_type to sh
30575 1726867569.78816: Set connection var ansible_shell_executable to /bin/sh
30575 1726867569.78840: Set connection var ansible_timeout to 10
30575 1726867569.78855: Set connection var ansible_module_compression to ZIP_DEFLATED
30575 1726867569.78935: Set connection var ansible_connection to ssh
30575 1726867569.78939: variable 'ansible_shell_executable' from source: unknown
30575 1726867569.78942: variable 'ansible_connection' from source: unknown
30575 1726867569.78944: variable 'ansible_module_compression' from source: unknown
30575 1726867569.78945: variable 'ansible_shell_type' from source: unknown
30575 1726867569.78947: variable 'ansible_shell_executable' from source: unknown
30575 1726867569.78951: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867569.78953: variable 'ansible_pipelining' from source: unknown
30575 1726867569.78955: variable 'ansible_timeout' from source: unknown
30575 1726867569.78957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867569.79095: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30575 1726867569.79110: variable 'omit' from source: magic vars
30575 1726867569.79121: starting attempt loop
30575 1726867569.79132: running the handler
30575 1726867569.79306: variable 'interface_stat' from source: set_fact
30575 1726867569.79371: Evaluated conditional (not interface_stat.stat.exists): True
30575 1726867569.79375: handler run complete
30575 1726867569.79378: attempt loop complete, returning result
30575 1726867569.79381: _execute() done
30575 1726867569.79383: dumping result to json
30575 1726867569.79385: done dumping result, returning
30575 1726867569.79387: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' [0affcac9-a3a5-e081-a588-000000000120]
30575 1726867569.79389: sending task result for task 0affcac9-a3a5-e081-a588-000000000120
ok: [managed_node3] => {
    "changed": false
}

MSG:

All assertions passed
30575 1726867569.79558: no more pending results, returning what we have
30575 1726867569.79562: results queue empty
30575 1726867569.79563: checking for any_errors_fatal
30575 1726867569.79572: done checking for any_errors_fatal
30575 1726867569.79573: checking for max_fail_percentage
30575 1726867569.79574: done checking for max_fail_percentage
30575 1726867569.79575: checking to see if all hosts have failed and the running result is not ok
30575 1726867569.79576: done checking to see if all hosts have failed
30575 1726867569.79781: getting the remaining hosts for this loop
30575 1726867569.79785: done getting the remaining hosts for this loop
30575 1726867569.79789: getting the next task for host managed_node3
30575 1726867569.79797: done getting next task for host managed_node3
30575 1726867569.79800: ^ task is: TASK: Test
30575 1726867569.79803: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867569.79807: getting variables
30575 1726867569.79808: in VariableManager get_vars()
30575 1726867569.79840: Calling all_inventory to load vars for managed_node3
30575 1726867569.79843: Calling groups_inventory to load vars for managed_node3
30575 1726867569.79847: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867569.79857: Calling all_plugins_play to load vars for managed_node3
30575 1726867569.79859: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867569.79862: Calling groups_plugins_play to load vars for managed_node3
30575 1726867569.80141: done sending task result for task 0affcac9-a3a5-e081-a588-000000000120
30575 1726867569.80144: WORKER PROCESS EXITING
30575 1726867569.80166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867569.80379: done with get_vars()
30575 1726867569.80390: done getting variables

TASK [Test] ********************************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30
Friday 20 September 2024 17:26:09 -0400 (0:00:00.034) 0:00:05.182 ******
30575 1726867569.80490: entering _queue_task() for managed_node3/include_tasks
30575 1726867569.80822: worker is 1 (out of 1 available)
30575 1726867569.80835: exiting _queue_task() for managed_node3/include_tasks
30575 1726867569.80846: done queuing things up, now waiting for results queue to drain
30575 1726867569.80847: waiting for pending results...
30575 1726867569.81081: running TaskExecutor() for managed_node3/TASK: Test
30575 1726867569.81181: in run() - task 0affcac9-a3a5-e081-a588-000000000095
30575 1726867569.81208: variable 'ansible_search_path' from source: unknown
30575 1726867569.81216: variable 'ansible_search_path' from source: unknown
30575 1726867569.81266: variable 'lsr_test' from source: include params
30575 1726867569.81529: variable 'lsr_test' from source: include params
30575 1726867569.81539: variable 'omit' from source: magic vars
30575 1726867569.81658: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867569.81672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867569.81689: variable 'omit' from source: magic vars
30575 1726867569.81920: variable 'ansible_distribution_major_version' from source: facts
30575 1726867569.81937: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867569.81947: variable 'item' from source: unknown
30575 1726867569.82019: variable 'item' from source: unknown
30575 1726867569.82057: variable 'item' from source: unknown
30575 1726867569.82130: variable 'item' from source: unknown
30575 1726867569.82272: dumping result to json
30575 1726867569.82275: done dumping result, returning
30575 1726867569.82385: done running TaskExecutor() for managed_node3/TASK: Test [0affcac9-a3a5-e081-a588-000000000095]
30575 1726867569.82391: sending task result for task 0affcac9-a3a5-e081-a588-000000000095
30575 1726867569.82432: done sending task result for task 0affcac9-a3a5-e081-a588-000000000095
30575 1726867569.82436: WORKER PROCESS EXITING
30575 1726867569.82518: no more pending results, returning what we have
30575 1726867569.82525: in VariableManager get_vars()
30575 1726867569.82557: Calling all_inventory to load vars for managed_node3
30575 1726867569.82560: Calling groups_inventory to load vars for managed_node3
30575 1726867569.82563: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867569.82575: Calling all_plugins_play to load vars for managed_node3
30575 1726867569.82580: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867569.82583: Calling groups_plugins_play to load vars for managed_node3
30575 1726867569.83108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867569.83308: done with get_vars()
30575 1726867569.83315: variable 'ansible_search_path' from source: unknown
30575 1726867569.83316: variable 'ansible_search_path' from source: unknown
30575 1726867569.83352: we have included files to process
30575 1726867569.83354: generating all_blocks data
30575 1726867569.83355: done generating all_blocks data
30575 1726867569.83357: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml
30575 1726867569.83358: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml
30575 1726867569.83360: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml
30575 1726867569.83661: done processing included file
30575 1726867569.83663: iterating over new_blocks loaded from include file
30575 1726867569.83664: in VariableManager get_vars()
30575 1726867569.83678: done with get_vars()
30575 1726867569.83680: filtering new block on tags
30575 1726867569.83710: done filtering new block on tags
30575 1726867569.83712: done iterating over new_blocks loaded from include file
included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node3 => (item=tasks/create_bridge_profile.yml)
30575 1726867569.83716: extending task lists for all hosts with included blocks
30575 1726867569.84603: done extending task lists
30575 1726867569.84604: done processing included files
30575 1726867569.84605: results queue empty
30575 1726867569.84606: checking for any_errors_fatal
30575 1726867569.84608: done checking for any_errors_fatal
30575 1726867569.84609: checking for max_fail_percentage
30575 1726867569.84614: done checking for max_fail_percentage
30575 1726867569.84615: checking to see if all hosts have failed and the running result is not ok
30575 1726867569.84616: done checking to see if all hosts have failed
30575 1726867569.84617: getting the remaining hosts for this loop
30575 1726867569.84618: done getting the remaining hosts for this loop
30575 1726867569.84620: getting the next task for host managed_node3
30575 1726867569.84627: done getting next task for host managed_node3
30575 1726867569.84629: ^ task is: TASK: Include network role
30575 1726867569.84631: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867569.84634: getting variables
30575 1726867569.84635: in VariableManager get_vars()
30575 1726867569.84642: Calling all_inventory to load vars for managed_node3
30575 1726867569.84644: Calling groups_inventory to load vars for managed_node3
30575 1726867569.84646: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867569.84651: Calling all_plugins_play to load vars for managed_node3
30575 1726867569.84653: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867569.84656: Calling groups_plugins_play to load vars for managed_node3
30575 1726867569.84827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867569.85026: done with get_vars()
30575 1726867569.85035: done getting variables

TASK [Include network role] ****************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3
Friday 20 September 2024 17:26:09 -0400 (0:00:00.046) 0:00:05.228 ******
30575 1726867569.85108: entering _queue_task() for managed_node3/include_role
30575 1726867569.85110: Creating lock for include_role
30575 1726867569.85483: worker is 1 (out of 1 available)
30575 1726867569.85494: exiting _queue_task() for managed_node3/include_role
30575 1726867569.85504: done queuing things up, now waiting for results queue to drain
30575 1726867569.85505: waiting for pending results...
30575 1726867569.85713: running TaskExecutor() for managed_node3/TASK: Include network role
30575 1726867569.85810: in run() - task 0affcac9-a3a5-e081-a588-00000000018e
30575 1726867569.85814: variable 'ansible_search_path' from source: unknown
30575 1726867569.85817: variable 'ansible_search_path' from source: unknown
30575 1726867569.85847: calling self._execute()
30575 1726867569.85926: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867569.85947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867569.85981: variable 'omit' from source: magic vars
30575 1726867569.86317: variable 'ansible_distribution_major_version' from source: facts
30575 1726867569.86335: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867569.86345: _execute() done
30575 1726867569.86358: dumping result to json
30575 1726867569.86366: done dumping result, returning
30575 1726867569.86464: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcac9-a3a5-e081-a588-00000000018e]
30575 1726867569.86467: sending task result for task 0affcac9-a3a5-e081-a588-00000000018e
30575 1726867569.86545: done sending task result for task 0affcac9-a3a5-e081-a588-00000000018e
30575 1726867569.86548: WORKER PROCESS EXITING
30575 1726867569.86579: no more pending results, returning what we have
30575 1726867569.86584: in VariableManager get_vars()
30575 1726867569.86614: Calling all_inventory to load vars for managed_node3
30575 1726867569.86616: Calling groups_inventory to load vars for managed_node3
30575 1726867569.86620: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867569.86635: Calling all_plugins_play to load vars for managed_node3
30575 1726867569.86638: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867569.86640: Calling groups_plugins_play to load vars for managed_node3
30575 1726867569.86985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867569.87192: done with get_vars()
30575 1726867569.87199: variable 'ansible_search_path' from source: unknown
30575 1726867569.87200: variable 'ansible_search_path' from source: unknown
30575 1726867569.87376: variable 'omit' from source: magic vars
30575 1726867569.87411: variable 'omit' from source: magic vars
30575 1726867569.87429: variable 'omit' from source: magic vars
30575 1726867569.87432: we have included files to process
30575 1726867569.87433: generating all_blocks data
30575 1726867569.87434: done generating all_blocks data
30575 1726867569.87435: processing included file: fedora.linux_system_roles.network
30575 1726867569.87453: in VariableManager get_vars()
30575 1726867569.87462: done with get_vars()
30575 1726867569.87518: in VariableManager get_vars()
30575 1726867569.87539: done with get_vars()
30575 1726867569.87635: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
30575 1726867569.87906: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
30575 1726867569.88043: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
30575 1726867569.88737: in VariableManager get_vars()
30575 1726867569.88756: done with get_vars()
30575 1726867569.89195: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
30575 1726867569.90904: iterating over new_blocks loaded from include file
30575 1726867569.90906: in VariableManager get_vars()
30575 1726867569.90921: done with get_vars()
30575 1726867569.90923: filtering new block on tags
30575 1726867569.91194: done filtering new block on tags
30575 1726867569.91197: in VariableManager get_vars()
30575 1726867569.91206: done with get_vars()
30575 1726867569.91207: filtering new block on tags
30575 1726867569.91217: done filtering new block on tags
30575 1726867569.91218: done iterating over new_blocks loaded from include file
included: fedora.linux_system_roles.network for managed_node3
30575 1726867569.91222: extending task lists for all hosts with included blocks
30575 1726867569.91327: done extending task lists
30575 1726867569.91328: done processing included files
30575 1726867569.91328: results queue empty
30575 1726867569.91329: checking for any_errors_fatal
30575 1726867569.91331: done checking for any_errors_fatal
30575 1726867569.91331: checking for max_fail_percentage
30575 1726867569.91332: done checking for max_fail_percentage
30575 1726867569.91332: checking to see if all hosts have failed and the running result is not ok
30575 1726867569.91333: done checking to see if all hosts have failed
30575 1726867569.91334: getting the remaining hosts for this loop
30575 1726867569.91334: done getting the remaining hosts for this loop
30575 1726867569.91336: getting the next task for host managed_node3
30575 1726867569.91339: done getting next task for host managed_node3
30575 1726867569.91341: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
30575 1726867569.91343: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867569.91350: getting variables
30575 1726867569.91351: in VariableManager get_vars()
30575 1726867569.91359: Calling all_inventory to load vars for managed_node3
30575 1726867569.91360: Calling groups_inventory to load vars for managed_node3
30575 1726867569.91362: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867569.91365: Calling all_plugins_play to load vars for managed_node3
30575 1726867569.91366: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867569.91368: Calling groups_plugins_play to load vars for managed_node3
30575 1726867569.91467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867569.91587: done with get_vars()
30575 1726867569.91594: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 17:26:09 -0400 (0:00:00.065) 0:00:05.293 ******
30575 1726867569.91641: entering _queue_task() for managed_node3/include_tasks
30575 1726867569.91858: worker is 1 (out of 1 available)
30575 1726867569.91869: exiting _queue_task() for managed_node3/include_tasks
30575 1726867569.91884: done queuing things up, now waiting for results queue to drain
30575 1726867569.91886: waiting for pending results...
30575 1726867569.92046: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
30575 1726867569.92122: in run() - task 0affcac9-a3a5-e081-a588-00000000020c
30575 1726867569.92134: variable 'ansible_search_path' from source: unknown
30575 1726867569.92138: variable 'ansible_search_path' from source: unknown
30575 1726867569.92167: calling self._execute()
30575 1726867569.92231: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867569.92234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867569.92246: variable 'omit' from source: magic vars
30575 1726867569.92501: variable 'ansible_distribution_major_version' from source: facts
30575 1726867569.92511: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867569.92516: _execute() done
30575 1726867569.92520: dumping result to json
30575 1726867569.92524: done dumping result, returning
30575 1726867569.92533: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-e081-a588-00000000020c]
30575 1726867569.92538: sending task result for task 0affcac9-a3a5-e081-a588-00000000020c
30575 1726867569.92627: done sending task result for task 0affcac9-a3a5-e081-a588-00000000020c
30575 1726867569.92630: WORKER PROCESS EXITING
30575 1726867569.92689: no more pending results, returning what we have
30575 1726867569.92693: in VariableManager get_vars()
30575 1726867569.92725: Calling all_inventory to load vars for managed_node3
30575 1726867569.92727: Calling groups_inventory to load vars for managed_node3
30575 1726867569.92729: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867569.92737: Calling all_plugins_play to load vars for managed_node3
30575 1726867569.92739: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867569.92741: Calling groups_plugins_play to load vars for managed_node3
30575 1726867569.92871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867569.93069: done with get_vars()
30575 1726867569.93076: variable 'ansible_search_path' from source: unknown
30575 1726867569.93080: variable 'ansible_search_path' from source: unknown
30575 1726867569.93114: we have included files to process
30575 1726867569.93116: generating all_blocks data
30575 1726867569.93117: done generating all_blocks data
30575 1726867569.93120: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30575 1726867569.93121: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30575 1726867569.93123: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30575 1726867569.93750: done processing included file
30575 1726867569.93752: iterating over new_blocks loaded from include file
30575 1726867569.93754: in VariableManager get_vars()
30575 1726867569.93775: done with get_vars()
30575 1726867569.93779: filtering new block on tags
30575 1726867569.93808: done filtering new block on tags
30575 1726867569.93811: in VariableManager get_vars()
30575 1726867569.93830: done with get_vars()
30575 1726867569.93832: filtering new block on tags
30575 1726867569.93874: done filtering new block on tags
30575 1726867569.93878: in VariableManager get_vars()
30575 1726867569.93899: done with get_vars()
30575 1726867569.93901: filtering new block on tags
30575 1726867569.93942: done filtering new block on tags
30575 1726867569.93944: done iterating over new_blocks loaded from include file
included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3
30575 1726867569.93949: extending task lists for all hosts with included blocks
30575 1726867569.94903: done extending task lists
30575 1726867569.94904: done processing included files
30575 1726867569.94905: results queue empty
30575 1726867569.94906: checking for any_errors_fatal
30575 1726867569.94908: done checking for any_errors_fatal
30575 1726867569.94909: checking for max_fail_percentage
30575 1726867569.94909: done checking for max_fail_percentage
30575 1726867569.94910: checking to see if all hosts have failed and the running result is not ok
30575 1726867569.94910: done checking to see if all hosts have failed
30575 1726867569.94911: getting the remaining hosts for this loop
30575 1726867569.94912: done getting the remaining hosts for this loop
30575 1726867569.94913: getting the next task for host managed_node3
30575 1726867569.94917: done getting next task for host managed_node3
30575 1726867569.94918: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
30575 1726867569.94921: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867569.94927: getting variables
30575 1726867569.94928: in VariableManager get_vars()
30575 1726867569.94936: Calling all_inventory to load vars for managed_node3
30575 1726867569.94937: Calling groups_inventory to load vars for managed_node3
30575 1726867569.94939: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867569.94942: Calling all_plugins_play to load vars for managed_node3
30575 1726867569.94943: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867569.94945: Calling groups_plugins_play to load vars for managed_node3
30575 1726867569.95022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867569.95134: done with get_vars()
30575 1726867569.95141: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 17:26:09 -0400 (0:00:00.035) 0:00:05.329 ******
30575 1726867569.95187: entering _queue_task() for managed_node3/setup
30575 1726867569.95372: worker is 1 (out of 1 available)
30575 1726867569.95387: exiting _queue_task() for managed_node3/setup
30575 1726867569.95399: done queuing things up, now waiting for results queue to drain
30575 1726867569.95400: waiting for pending results...
30575 1726867569.95555: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
30575 1726867569.95642: in run() - task 0affcac9-a3a5-e081-a588-000000000269
30575 1726867569.95654: variable 'ansible_search_path' from source: unknown
30575 1726867569.95658: variable 'ansible_search_path' from source: unknown
30575 1726867569.95690: calling self._execute()
30575 1726867569.95751: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867569.95755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867569.95763: variable 'omit' from source: magic vars
30575 1726867569.96182: variable 'ansible_distribution_major_version' from source: facts
30575 1726867569.96186: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867569.96307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30575 1726867569.98193: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30575 1726867569.98234: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30575 1726867569.98262: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30575 1726867569.98290: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30575 1726867569.98320: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30575 1726867569.98376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30575 1726867569.98401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30575 1726867569.98419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30575 1726867569.98447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30575 1726867569.98457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30575 1726867569.98497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30575 1726867569.98515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30575 1726867569.98532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30575 1726867569.98557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30575 1726867569.98568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30575 1726867569.98667: variable '__network_required_facts' from source: role
'' defaults 30575 1726867569.98674: variable 'ansible_facts' from source: unknown 30575 1726867569.98729: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30575 1726867569.98733: when evaluation is False, skipping this task 30575 1726867569.98735: _execute() done 30575 1726867569.98737: dumping result to json 30575 1726867569.98740: done dumping result, returning 30575 1726867569.98743: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-e081-a588-000000000269] 30575 1726867569.98748: sending task result for task 0affcac9-a3a5-e081-a588-000000000269 30575 1726867569.98825: done sending task result for task 0affcac9-a3a5-e081-a588-000000000269 30575 1726867569.98828: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867569.98894: no more pending results, returning what we have 30575 1726867569.98897: results queue empty 30575 1726867569.98898: checking for any_errors_fatal 30575 1726867569.98899: done checking for any_errors_fatal 30575 1726867569.98899: checking for max_fail_percentage 30575 1726867569.98902: done checking for max_fail_percentage 30575 1726867569.98903: checking to see if all hosts have failed and the running result is not ok 30575 1726867569.98904: done checking to see if all hosts have failed 30575 1726867569.98904: getting the remaining hosts for this loop 30575 1726867569.98906: done getting the remaining hosts for this loop 30575 1726867569.98909: getting the next task for host managed_node3 30575 1726867569.98919: done getting next task for host managed_node3 30575 1726867569.98922: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867569.98930: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867569.98945: getting variables 30575 1726867569.98946: in VariableManager get_vars() 30575 1726867569.98976: Calling all_inventory to load vars for managed_node3 30575 1726867569.98979: Calling groups_inventory to load vars for managed_node3 30575 1726867569.98982: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867569.98989: Calling all_plugins_play to load vars for managed_node3 30575 1726867569.98991: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867569.98998: Calling groups_plugins_play to load vars for managed_node3 30575 1726867569.99188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867569.99396: done with get_vars() 30575 1726867569.99405: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:26:09 -0400 (0:00:00.043) 0:00:05.372 ****** 30575 1726867569.99500: entering _queue_task() for managed_node3/stat 30575 1726867569.99733: worker is 1 (out of 1 available) 30575 1726867569.99745: exiting _queue_task() for managed_node3/stat 30575 1726867569.99759: done queuing things up, now waiting for results queue to drain 30575 1726867569.99761: waiting for pending results... 
30575 1726867570.00194: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867570.00198: in run() - task 0affcac9-a3a5-e081-a588-00000000026b 30575 1726867570.00201: variable 'ansible_search_path' from source: unknown 30575 1726867570.00203: variable 'ansible_search_path' from source: unknown 30575 1726867570.00227: calling self._execute() 30575 1726867570.00306: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867570.00325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867570.00342: variable 'omit' from source: magic vars 30575 1726867570.00656: variable 'ansible_distribution_major_version' from source: facts 30575 1726867570.00674: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867570.00863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867570.01108: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867570.01158: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867570.01197: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867570.01244: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867570.01358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867570.01390: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867570.01429: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867570.01533: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867570.01574: variable '__network_is_ostree' from source: set_fact 30575 1726867570.01588: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867570.01603: when evaluation is False, skipping this task 30575 1726867570.01644: _execute() done 30575 1726867570.01648: dumping result to json 30575 1726867570.01651: done dumping result, returning 30575 1726867570.01653: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-e081-a588-00000000026b] 30575 1726867570.01681: sending task result for task 0affcac9-a3a5-e081-a588-00000000026b 30575 1726867570.01789: done sending task result for task 0affcac9-a3a5-e081-a588-00000000026b 30575 1726867570.01792: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867570.01879: no more pending results, returning what we have 30575 1726867570.01882: results queue empty 30575 1726867570.01883: checking for any_errors_fatal 30575 1726867570.01888: done checking for any_errors_fatal 30575 1726867570.01889: checking for max_fail_percentage 30575 1726867570.01890: done checking for max_fail_percentage 30575 1726867570.01891: checking to see if all hosts have failed and the running result is not ok 30575 1726867570.01891: done checking to see if all hosts have failed 30575 1726867570.01892: getting the remaining hosts for this loop 30575 1726867570.01893: done getting the remaining hosts for this loop 30575 
1726867570.01896: getting the next task for host managed_node3 30575 1726867570.01902: done getting next task for host managed_node3 30575 1726867570.01905: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867570.01910: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867570.01921: getting variables 30575 1726867570.01922: in VariableManager get_vars() 30575 1726867570.01946: Calling all_inventory to load vars for managed_node3 30575 1726867570.01947: Calling groups_inventory to load vars for managed_node3 30575 1726867570.01949: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867570.01954: Calling all_plugins_play to load vars for managed_node3 30575 1726867570.01956: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867570.01957: Calling groups_plugins_play to load vars for managed_node3 30575 1726867570.02060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867570.02183: done with get_vars() 30575 1726867570.02190: done getting variables 30575 1726867570.02227: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:26:10 -0400 (0:00:00.027) 0:00:05.400 ****** 30575 1726867570.02250: entering _queue_task() for managed_node3/set_fact 30575 1726867570.02427: worker is 1 (out of 1 available) 30575 1726867570.02439: exiting _queue_task() for managed_node3/set_fact 30575 1726867570.02451: done queuing things up, now waiting for results queue to drain 30575 1726867570.02453: waiting for pending results... 
30575 1726867570.02597: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867570.02687: in run() - task 0affcac9-a3a5-e081-a588-00000000026c 30575 1726867570.02691: variable 'ansible_search_path' from source: unknown 30575 1726867570.02694: variable 'ansible_search_path' from source: unknown 30575 1726867570.02720: calling self._execute() 30575 1726867570.02773: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867570.02781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867570.02790: variable 'omit' from source: magic vars 30575 1726867570.03066: variable 'ansible_distribution_major_version' from source: facts 30575 1726867570.03074: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867570.03182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867570.03353: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867570.03383: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867570.03406: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867570.03435: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867570.03583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867570.03587: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867570.03609: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867570.03625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867570.03710: variable '__network_is_ostree' from source: set_fact 30575 1726867570.03726: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867570.03736: when evaluation is False, skipping this task 30575 1726867570.03744: _execute() done 30575 1726867570.03751: dumping result to json 30575 1726867570.03760: done dumping result, returning 30575 1726867570.03771: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-e081-a588-00000000026c] 30575 1726867570.03784: sending task result for task 0affcac9-a3a5-e081-a588-00000000026c skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867570.03954: no more pending results, returning what we have 30575 1726867570.03958: results queue empty 30575 1726867570.03959: checking for any_errors_fatal 30575 1726867570.03963: done checking for any_errors_fatal 30575 1726867570.03964: checking for max_fail_percentage 30575 1726867570.03965: done checking for max_fail_percentage 30575 1726867570.03966: checking to see if all hosts have failed and the running result is not ok 30575 1726867570.03967: done checking to see if all hosts have failed 30575 1726867570.03968: getting the remaining hosts for this loop 30575 1726867570.03969: done getting the remaining hosts for this loop 30575 1726867570.03973: getting the next task for host managed_node3 30575 1726867570.03984: done getting next task for host managed_node3 30575 
1726867570.03987: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867570.03992: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867570.04003: getting variables 30575 1726867570.04004: in VariableManager get_vars() 30575 1726867570.04033: Calling all_inventory to load vars for managed_node3 30575 1726867570.04035: Calling groups_inventory to load vars for managed_node3 30575 1726867570.04037: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867570.04045: Calling all_plugins_play to load vars for managed_node3 30575 1726867570.04047: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867570.04049: Calling groups_plugins_play to load vars for managed_node3 30575 1726867570.04254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867570.04520: done with get_vars() 30575 1726867570.04529: done getting variables 30575 1726867570.04551: done sending task result for task 0affcac9-a3a5-e081-a588-00000000026c 30575 1726867570.04554: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:26:10 -0400 (0:00:00.023) 0:00:05.423 ****** 30575 1726867570.04601: entering _queue_task() for managed_node3/service_facts 30575 1726867570.04603: Creating lock for service_facts 30575 1726867570.04779: worker is 1 (out of 1 available) 30575 1726867570.04791: exiting _queue_task() for managed_node3/service_facts 30575 1726867570.04803: done queuing things up, now waiting for results queue to drain 30575 1726867570.04804: waiting for pending results... 
30575 1726867570.04963: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867570.05045: in run() - task 0affcac9-a3a5-e081-a588-00000000026e 30575 1726867570.05057: variable 'ansible_search_path' from source: unknown 30575 1726867570.05061: variable 'ansible_search_path' from source: unknown 30575 1726867570.05091: calling self._execute() 30575 1726867570.05148: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867570.05152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867570.05161: variable 'omit' from source: magic vars 30575 1726867570.05398: variable 'ansible_distribution_major_version' from source: facts 30575 1726867570.05411: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867570.05414: variable 'omit' from source: magic vars 30575 1726867570.05461: variable 'omit' from source: magic vars 30575 1726867570.05485: variable 'omit' from source: magic vars 30575 1726867570.05515: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867570.05542: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867570.05557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867570.05571: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867570.05580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867570.05603: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867570.05606: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867570.05611: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867570.05680: Set connection var ansible_pipelining to False 30575 1726867570.05684: Set connection var ansible_shell_type to sh 30575 1726867570.05689: Set connection var ansible_shell_executable to /bin/sh 30575 1726867570.05694: Set connection var ansible_timeout to 10 30575 1726867570.05699: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867570.05706: Set connection var ansible_connection to ssh 30575 1726867570.05725: variable 'ansible_shell_executable' from source: unknown 30575 1726867570.05728: variable 'ansible_connection' from source: unknown 30575 1726867570.05731: variable 'ansible_module_compression' from source: unknown 30575 1726867570.05733: variable 'ansible_shell_type' from source: unknown 30575 1726867570.05737: variable 'ansible_shell_executable' from source: unknown 30575 1726867570.05739: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867570.05741: variable 'ansible_pipelining' from source: unknown 30575 1726867570.05743: variable 'ansible_timeout' from source: unknown 30575 1726867570.05745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867570.05874: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867570.05884: variable 'omit' from source: magic vars 30575 1726867570.05889: starting attempt loop 30575 1726867570.05891: running the handler 30575 1726867570.05903: _low_level_execute_command(): starting 30575 1726867570.05974: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867570.06381: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30575 1726867570.06385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867570.06389: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867570.06391: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867570.06441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867570.06444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867570.06450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867570.06501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867570.08167: stdout chunk (state=3): >>>/root <<< 30575 1726867570.08265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867570.08289: stderr chunk (state=3): >>><<< 30575 1726867570.08292: stdout chunk (state=3): >>><<< 30575 1726867570.08308: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867570.08318: _low_level_execute_command(): starting 30575 1726867570.08325: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867570.083081-30830-75864258361345 `" && echo ansible-tmp-1726867570.083081-30830-75864258361345="` echo /root/.ansible/tmp/ansible-tmp-1726867570.083081-30830-75864258361345 `" ) && sleep 0' 30575 1726867570.08728: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867570.08732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867570.08740: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration <<< 30575 1726867570.08742: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867570.08744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867570.08783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867570.08793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867570.08840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867570.10729: stdout chunk (state=3): >>>ansible-tmp-1726867570.083081-30830-75864258361345=/root/.ansible/tmp/ansible-tmp-1726867570.083081-30830-75864258361345 <<< 30575 1726867570.10835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867570.10856: stderr chunk (state=3): >>><<< 30575 1726867570.10859: stdout chunk (state=3): >>><<< 30575 1726867570.10870: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867570.083081-30830-75864258361345=/root/.ansible/tmp/ansible-tmp-1726867570.083081-30830-75864258361345 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867570.10908: variable 'ansible_module_compression' from source: unknown 30575 1726867570.10938: ANSIBALLZ: Using lock for service_facts 30575 1726867570.10941: ANSIBALLZ: Acquiring lock 30575 1726867570.10945: ANSIBALLZ: Lock acquired: 140240643119952 30575 1726867570.10948: ANSIBALLZ: Creating module 30575 1726867570.19284: ANSIBALLZ: Writing module into payload 30575 1726867570.19315: ANSIBALLZ: Writing module 30575 1726867570.19339: ANSIBALLZ: Renaming module 30575 1726867570.19357: ANSIBALLZ: Done creating module 30575 1726867570.19381: variable 'ansible_facts' from source: unknown 30575 1726867570.19461: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867570.083081-30830-75864258361345/AnsiballZ_service_facts.py 30575 1726867570.19678: Sending initial data 30575 1726867570.19688: Sent initial data (160 bytes) 30575 1726867570.20294: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867570.20325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867570.20328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867570.20330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867570.20383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867570.21951: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867570.21993: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867570.22039: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp4yr42ugf /root/.ansible/tmp/ansible-tmp-1726867570.083081-30830-75864258361345/AnsiballZ_service_facts.py <<< 30575 1726867570.22049: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867570.083081-30830-75864258361345/AnsiballZ_service_facts.py" <<< 30575 1726867570.22085: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp4yr42ugf" to remote "/root/.ansible/tmp/ansible-tmp-1726867570.083081-30830-75864258361345/AnsiballZ_service_facts.py" <<< 30575 1726867570.22088: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867570.083081-30830-75864258361345/AnsiballZ_service_facts.py" <<< 30575 1726867570.22637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867570.22672: stderr chunk (state=3): >>><<< 30575 1726867570.22675: stdout chunk (state=3): >>><<< 30575 1726867570.22734: done transferring module to remote 30575 1726867570.22743: _low_level_execute_command(): starting 30575 1726867570.22748: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867570.083081-30830-75864258361345/ /root/.ansible/tmp/ansible-tmp-1726867570.083081-30830-75864258361345/AnsiballZ_service_facts.py && sleep 0' 30575 1726867570.23144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867570.23163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867570.23214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867570.23217: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867570.23267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867570.25073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867570.25076: stdout chunk (state=3): >>><<< 30575 1726867570.25081: stderr chunk (state=3): >>><<< 30575 1726867570.25172: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867570.25176: _low_level_execute_command(): starting 30575 1726867570.25184: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867570.083081-30830-75864258361345/AnsiballZ_service_facts.py && sleep 0' 30575 1726867570.25755: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867570.25759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867570.25775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867570.25838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 30575 1726867571.76920: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": 
"nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 30575 1726867571.76971: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": 
"systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 30575 1726867571.76998: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": 
"nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": 
"systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": 
"systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": 
{"module_args": {}}} <<< 30575 1726867571.78584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867571.78589: stdout chunk (state=3): >>><<< 30575 1726867571.78591: stderr chunk (state=3): >>><<< 30575 1726867571.78788: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": 
"chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": 
{"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": 
"systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867571.79331: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867570.083081-30830-75864258361345/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867571.79357: _low_level_execute_command(): starting 30575 1726867571.79367: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867570.083081-30830-75864258361345/ > /dev/null 2>&1 && sleep 0' 30575 1726867571.80034: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867571.80048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867571.80062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867571.80084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867571.80101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867571.80134: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867571.80192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867571.80245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867571.80268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867571.80284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867571.80362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867571.82683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867571.82687: stdout chunk (state=3): >>><<< 30575 1726867571.82689: stderr chunk (state=3): >>><<< 30575 1726867571.82692: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867571.82694: handler run complete 30575 1726867571.82696: variable 'ansible_facts' from source: unknown 30575 1726867571.83016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867571.83715: variable 'ansible_facts' from source: unknown 30575 1726867571.83859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867571.84058: attempt loop complete, returning result 30575 1726867571.84069: _execute() done 30575 1726867571.84076: dumping result to json 30575 1726867571.84143: done dumping result, returning 30575 1726867571.84157: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-e081-a588-00000000026e] 30575 1726867571.84166: sending task result for task 0affcac9-a3a5-e081-a588-00000000026e ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867571.85008: done sending task result for task 0affcac9-a3a5-e081-a588-00000000026e 30575 1726867571.85011: WORKER PROCESS EXITING 30575 1726867571.85016: no more pending results, returning what we have 30575 1726867571.85018: results queue empty 30575 1726867571.85019: checking for any_errors_fatal 30575 1726867571.85020: done checking for any_errors_fatal 30575 1726867571.85021: checking for max_fail_percentage 30575 1726867571.85022: done checking for max_fail_percentage 30575 1726867571.85023: checking to see if all hosts have failed and the running result is not ok 30575 1726867571.85023: done checking to see if all hosts have failed 30575 1726867571.85024: getting the remaining 
hosts for this loop 30575 1726867571.85025: done getting the remaining hosts for this loop 30575 1726867571.85028: getting the next task for host managed_node3 30575 1726867571.85032: done getting next task for host managed_node3 30575 1726867571.85034: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867571.85038: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867571.85044: getting variables 30575 1726867571.85044: in VariableManager get_vars() 30575 1726867571.85062: Calling all_inventory to load vars for managed_node3 30575 1726867571.85064: Calling groups_inventory to load vars for managed_node3 30575 1726867571.85065: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867571.85071: Calling all_plugins_play to load vars for managed_node3 30575 1726867571.85075: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867571.85079: Calling groups_plugins_play to load vars for managed_node3 30575 1726867571.85366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867571.85829: done with get_vars() 30575 1726867571.85840: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:26:11 -0400 (0:00:01.813) 0:00:07.236 ****** 30575 1726867571.85923: entering _queue_task() for managed_node3/package_facts 30575 1726867571.85925: Creating lock for package_facts 30575 1726867571.86194: worker is 1 (out of 1 available) 30575 1726867571.86208: exiting _queue_task() for managed_node3/package_facts 30575 1726867571.86223: done queuing things up, now waiting for results queue to drain 30575 1726867571.86224: waiting for pending results... 
30575 1726867571.86513: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867571.86885: in run() - task 0affcac9-a3a5-e081-a588-00000000026f 30575 1726867571.86889: variable 'ansible_search_path' from source: unknown 30575 1726867571.86893: variable 'ansible_search_path' from source: unknown 30575 1726867571.86900: calling self._execute() 30575 1726867571.87185: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867571.87189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867571.87192: variable 'omit' from source: magic vars 30575 1726867571.87826: variable 'ansible_distribution_major_version' from source: facts 30575 1726867571.87852: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867571.87864: variable 'omit' from source: magic vars 30575 1726867571.87982: variable 'omit' from source: magic vars 30575 1726867571.88019: variable 'omit' from source: magic vars 30575 1726867571.88067: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867571.88109: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867571.88133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867571.88155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867571.88175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867571.88210: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867571.88219: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867571.88227: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867571.88329: Set connection var ansible_pipelining to False 30575 1726867571.88337: Set connection var ansible_shell_type to sh 30575 1726867571.88346: Set connection var ansible_shell_executable to /bin/sh 30575 1726867571.88355: Set connection var ansible_timeout to 10 30575 1726867571.88364: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867571.88378: Set connection var ansible_connection to ssh 30575 1726867571.88408: variable 'ansible_shell_executable' from source: unknown 30575 1726867571.88415: variable 'ansible_connection' from source: unknown 30575 1726867571.88423: variable 'ansible_module_compression' from source: unknown 30575 1726867571.88429: variable 'ansible_shell_type' from source: unknown 30575 1726867571.88435: variable 'ansible_shell_executable' from source: unknown 30575 1726867571.88441: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867571.88448: variable 'ansible_pipelining' from source: unknown 30575 1726867571.88454: variable 'ansible_timeout' from source: unknown 30575 1726867571.88461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867571.88648: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867571.88662: variable 'omit' from source: magic vars 30575 1726867571.88671: starting attempt loop 30575 1726867571.88679: running the handler 30575 1726867571.88708: _low_level_execute_command(): starting 30575 1726867571.88711: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867571.89384: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867571.89400: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30575 1726867571.89416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867571.89482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867571.89532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867571.89551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867571.89571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867571.89660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867571.91400: stdout chunk (state=3): >>>/root <<< 30575 1726867571.91511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867571.91550: stderr chunk (state=3): >>><<< 30575 1726867571.91573: stdout chunk (state=3): >>><<< 30575 1726867571.91606: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867571.91697: _low_level_execute_command(): starting 30575 1726867571.91701: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867571.916112-30898-155716542745893 `" && echo ansible-tmp-1726867571.916112-30898-155716542745893="` echo /root/.ansible/tmp/ansible-tmp-1726867571.916112-30898-155716542745893 `" ) && sleep 0' 30575 1726867571.92232: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867571.92245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867571.92262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867571.92278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867571.92566: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867571.92595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867571.92651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867571.94794: stdout chunk (state=3): >>>ansible-tmp-1726867571.916112-30898-155716542745893=/root/.ansible/tmp/ansible-tmp-1726867571.916112-30898-155716542745893 <<< 30575 1726867571.94849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867571.95005: stderr chunk (state=3): >>><<< 30575 1726867571.95009: stdout chunk (state=3): >>><<< 30575 1726867571.95183: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867571.916112-30898-155716542745893=/root/.ansible/tmp/ansible-tmp-1726867571.916112-30898-155716542745893 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867571.95186: variable 'ansible_module_compression' from source: unknown 30575 1726867571.95188: ANSIBALLZ: Using lock for package_facts 30575 1726867571.95190: ANSIBALLZ: Acquiring lock 30575 1726867571.95192: ANSIBALLZ: Lock acquired: 140240642863232 30575 1726867571.95194: ANSIBALLZ: Creating module 30575 1726867572.22906: ANSIBALLZ: Writing module into payload 30575 1726867572.23047: ANSIBALLZ: Writing module 30575 1726867572.23086: ANSIBALLZ: Renaming module 30575 1726867572.23099: ANSIBALLZ: Done creating module 30575 1726867572.23137: variable 'ansible_facts' from source: unknown 30575 1726867572.23318: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867571.916112-30898-155716542745893/AnsiballZ_package_facts.py 30575 1726867572.23563: Sending initial data 30575 1726867572.23572: Sent initial data (161 bytes) 30575 1726867572.24136: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867572.24151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867572.24168: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867572.24252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867572.25926: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30575 1726867572.25941: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30575 1726867572.25955: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30575 1726867572.25967: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 30575 1726867572.25981: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 30575 1726867572.25996: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 30575 1726867572.26028: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 
1726867572.26060: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867572.26132: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpon7u4dl1 /root/.ansible/tmp/ansible-tmp-1726867571.916112-30898-155716542745893/AnsiballZ_package_facts.py <<< 30575 1726867572.26150: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867571.916112-30898-155716542745893/AnsiballZ_package_facts.py" <<< 30575 1726867572.26189: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpon7u4dl1" to remote "/root/.ansible/tmp/ansible-tmp-1726867571.916112-30898-155716542745893/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867571.916112-30898-155716542745893/AnsiballZ_package_facts.py" <<< 30575 1726867572.27817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867572.27886: stderr chunk (state=3): >>><<< 30575 1726867572.27900: stdout chunk (state=3): >>><<< 30575 1726867572.27948: done transferring module to remote 30575 1726867572.27972: _low_level_execute_command(): starting 30575 1726867572.27986: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867571.916112-30898-155716542745893/ /root/.ansible/tmp/ansible-tmp-1726867571.916112-30898-155716542745893/AnsiballZ_package_facts.py && sleep 0' 30575 1726867572.28815: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867572.29000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867572.29003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867572.29015: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867572.29256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867572.29329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867572.31282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867572.31285: stdout chunk (state=3): >>><<< 30575 1726867572.31288: stderr chunk (state=3): >>><<< 30575 1726867572.31290: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867572.31292: _low_level_execute_command(): starting 30575 1726867572.31295: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867571.916112-30898-155716542745893/AnsiballZ_package_facts.py && sleep 0' 30575 1726867572.31895: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867572.31921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867572.31934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867572.31949: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
30575 1726867572.32082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867572.76641: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": 
"linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30575 1726867572.76664: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": 
"centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", 
"version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": 
"libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30575 1726867572.76703: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": 
"keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": 
"3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": 
[{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30575 1726867572.76714: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", 
"version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": 
[{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 30575 1726867572.76752: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", 
"release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5",
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1",
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": 
"13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}],
"perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": 
"perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18",
"release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch":
"noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}],
"python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": 
"lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30575 1726867572.78668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
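The module response above is the `ansible_facts.packages` mapping returned by `package_facts` (invoked, per the trailing `invocation` block, with `manager: ["auto"]` and `strategy: first`): each key is a package name mapped to a list of per-instance entries (`name`, `version`, `release`, `epoch`, `arch`, `source`). A minimal Python sketch of how such facts can be queried, using a hypothetical two-package sample in the same shape (not the full dump above):

```python
import json

# Hypothetical sample mirroring the package_facts structure: each
# package name maps to a LIST of entries, since more than one
# version/arch of a package can be installed at once.
facts_json = '''
{"ansible_facts": {"packages": {
  "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10",
               "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "kernel": [{"name": "kernel", "version": "6.11.0",
              "release": "0.rc6.23.el10", "epoch": null,
              "arch": "x86_64", "source": "rpm"}]
}}}
'''

packages = json.loads(facts_json)["ansible_facts"]["packages"]

def installed_version(name):
    """Return the version of the first entry for a package, or None."""
    entries = packages.get(name, [])
    return entries[0]["version"] if entries else None

print(installed_version("openssh"))  # 9.8p1
print(installed_version("kernel"))   # 6.11.0
```

In a playbook the same lookup would typically be done in Jinja2, e.g. guarding a task with `when: "'openssh' in ansible_facts.packages"`.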
<<< 30575 1726867572.78671: stdout chunk (state=3): >>><<< 30575 1726867572.78682: stderr chunk (state=3): >>><<< 30575 1726867572.78716: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
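The JSON blob above is the result of the `package_facts` module run recorded in this trace: a mapping from package name to a list of per-instance dicts (name, version, release, epoch, arch, source). A minimal sketch of how that structure can be consumed, using a hand-copied excerpt (openssl, kernel) from the log — the helper function `pkg_version` is illustrative, not part of Ansible:

```python
# Sketch: reading the ansible_facts.packages structure returned by the
# package_facts module in the trace above.
# The dict below is a hand-copied excerpt (openssl, kernel) from the log;
# the real result maps every installed package name to a list of
# per-version/per-arch instances.
packages = {
    "openssl": [
        {"name": "openssl", "version": "3.2.2", "release": "12.el10",
         "epoch": 1, "arch": "x86_64", "source": "rpm"}
    ],
    "kernel": [
        {"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10",
         "epoch": None, "arch": "x86_64", "source": "rpm"}
    ],
}

def pkg_version(packages, name):
    """Return the version of the first installed instance of a package,
    or None if the package is absent from the facts."""
    instances = packages.get(name, [])
    return instances[0]["version"] if instances else None

print(pkg_version(packages, "openssl"))  # 3.2.2
print(pkg_version(packages, "not-installed"))  # None
```

Each value is a list because a package can be installed more than once (e.g. multiple kernel versions), so callers should not assume a single instance per name.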
30575 1726867572.80321: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867571.916112-30898-155716542745893/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867572.80341: _low_level_execute_command(): starting 30575 1726867572.80345: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867571.916112-30898-155716542745893/ > /dev/null 2>&1 && sleep 0' 30575 1726867572.80763: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867572.80766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867572.80768: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867572.80771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867572.80773: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867572.80828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867572.80831: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867572.80833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867572.80874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867572.82728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867572.82752: stderr chunk (state=3): >>><<< 30575 1726867572.82755: stdout chunk (state=3): >>><<< 30575 1726867572.82767: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867572.82773: handler run complete 30575 1726867572.83370: variable 'ansible_facts' from source: unknown 30575 1726867572.83981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867572.84971: variable 'ansible_facts' from source: unknown 30575 1726867572.85197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867572.85571: attempt loop complete, returning result 30575 1726867572.85581: _execute() done 30575 1726867572.85584: dumping result to json 30575 1726867572.85700: done dumping result, returning 30575 1726867572.85708: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-e081-a588-00000000026f] 30575 1726867572.85718: sending task result for task 0affcac9-a3a5-e081-a588-00000000026f 30575 1726867572.87776: done sending task result for task 0affcac9-a3a5-e081-a588-00000000026f 30575 1726867572.87782: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867572.87882: no more pending results, returning what we have 30575 1726867572.87885: results queue empty 30575 1726867572.87885: checking for any_errors_fatal 30575 1726867572.87889: done checking for any_errors_fatal 30575 1726867572.87890: checking for max_fail_percentage 30575 1726867572.87892: done checking for max_fail_percentage 30575 1726867572.87892: checking to see if all hosts have failed and the running result is not ok 30575 1726867572.87893: done checking to see if all hosts have failed 30575 1726867572.87894: getting the remaining hosts for this loop 30575 1726867572.87895: done getting the remaining hosts for this loop 30575 1726867572.87899: getting 
the next task for host managed_node3 30575 1726867572.87906: done getting next task for host managed_node3 30575 1726867572.87909: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867572.87915: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867572.87924: getting variables 30575 1726867572.87926: in VariableManager get_vars() 30575 1726867572.87953: Calling all_inventory to load vars for managed_node3 30575 1726867572.87955: Calling groups_inventory to load vars for managed_node3 30575 1726867572.87958: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867572.87966: Calling all_plugins_play to load vars for managed_node3 30575 1726867572.87968: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867572.87971: Calling groups_plugins_play to load vars for managed_node3 30575 1726867572.89247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867572.90933: done with get_vars() 30575 1726867572.90965: done getting variables 30575 1726867572.91026: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:26:12 -0400 (0:00:01.051) 0:00:08.288 ****** 30575 1726867572.91063: entering _queue_task() for managed_node3/debug 30575 1726867572.91419: worker is 1 (out of 1 available) 30575 1726867572.91431: exiting _queue_task() for managed_node3/debug 30575 1726867572.91443: done queuing things up, now waiting for results queue to drain 30575 1726867572.91444: waiting for pending results... 
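The `entering _queue_task()` / `worker is 1 (out of 1 available)` / `waiting for pending results...` cycle traced above is a producer–consumer hand-off: the strategy queues a task, a worker process executes it and sends the result back, and the main loop drains the results queue. A minimal sketch of that pattern using Python's stdlib `queue` and `threading` (the names `task_queue`, `worker`, and the payloads are illustrative, not Ansible's actual internals):

```python
import queue
import threading

task_queue = queue.Queue()    # tasks queued by the strategy ("entering _queue_task()")
result_queue = queue.Queue()  # drained by the main loop ("waiting for pending results...")

def worker():
    # "worker is 1 (out of 1 available)": a single worker pulls tasks
    # and pushes results back ("sending task result for task ...").
    while True:
        task = task_queue.get()
        if task is None:  # sentinel, analogous to "WORKER PROCESS EXITING"
            break
        result_queue.put((task, "ok"))

t = threading.Thread(target=worker)
t.start()
task_queue.put("debug: Print network provider")
task_queue.put(None)
t.join()

result = result_queue.get()
print(result)  # ('debug: Print network provider', 'ok')
```

Ansible uses forked worker processes and more elaborate result objects, but the queue-and-drain shape of the log lines matches this sketch.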
30575 1726867572.91793: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867572.91910: in run() - task 0affcac9-a3a5-e081-a588-00000000020d 30575 1726867572.91950: variable 'ansible_search_path' from source: unknown 30575 1726867572.91959: variable 'ansible_search_path' from source: unknown 30575 1726867572.92001: calling self._execute() 30575 1726867572.92108: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867572.92120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867572.92249: variable 'omit' from source: magic vars 30575 1726867572.92513: variable 'ansible_distribution_major_version' from source: facts 30575 1726867572.92531: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867572.92542: variable 'omit' from source: magic vars 30575 1726867572.92613: variable 'omit' from source: magic vars 30575 1726867572.92720: variable 'network_provider' from source: set_fact 30575 1726867572.92743: variable 'omit' from source: magic vars 30575 1726867572.92792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867572.92834: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867572.92859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867572.92884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867572.92910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867572.92949: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867572.92958: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867572.93013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867572.93079: Set connection var ansible_pipelining to False 30575 1726867572.93089: Set connection var ansible_shell_type to sh 30575 1726867572.93101: Set connection var ansible_shell_executable to /bin/sh 30575 1726867572.93117: Set connection var ansible_timeout to 10 30575 1726867572.93132: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867572.93144: Set connection var ansible_connection to ssh 30575 1726867572.93172: variable 'ansible_shell_executable' from source: unknown 30575 1726867572.93231: variable 'ansible_connection' from source: unknown 30575 1726867572.93235: variable 'ansible_module_compression' from source: unknown 30575 1726867572.93238: variable 'ansible_shell_type' from source: unknown 30575 1726867572.93240: variable 'ansible_shell_executable' from source: unknown 30575 1726867572.93242: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867572.93244: variable 'ansible_pipelining' from source: unknown 30575 1726867572.93246: variable 'ansible_timeout' from source: unknown 30575 1726867572.93249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867572.93372: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867572.93390: variable 'omit' from source: magic vars 30575 1726867572.93401: starting attempt loop 30575 1726867572.93408: running the handler 30575 1726867572.93558: handler run complete 30575 1726867572.93562: attempt loop complete, returning result 30575 1726867572.93564: _execute() done 30575 1726867572.93567: dumping result to json 30575 1726867572.93569: done dumping result, returning 
30575 1726867572.93571: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-e081-a588-00000000020d] 30575 1726867572.93573: sending task result for task 0affcac9-a3a5-e081-a588-00000000020d 30575 1726867572.93639: done sending task result for task 0affcac9-a3a5-e081-a588-00000000020d 30575 1726867572.93642: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30575 1726867572.93722: no more pending results, returning what we have 30575 1726867572.93726: results queue empty 30575 1726867572.93727: checking for any_errors_fatal 30575 1726867572.93737: done checking for any_errors_fatal 30575 1726867572.93738: checking for max_fail_percentage 30575 1726867572.93740: done checking for max_fail_percentage 30575 1726867572.93741: checking to see if all hosts have failed and the running result is not ok 30575 1726867572.93742: done checking to see if all hosts have failed 30575 1726867572.93743: getting the remaining hosts for this loop 30575 1726867572.93747: done getting the remaining hosts for this loop 30575 1726867572.93755: getting the next task for host managed_node3 30575 1726867572.93764: done getting next task for host managed_node3 30575 1726867572.93770: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867572.93982: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867572.93993: getting variables 30575 1726867572.93995: in VariableManager get_vars() 30575 1726867572.94026: Calling all_inventory to load vars for managed_node3 30575 1726867572.94029: Calling groups_inventory to load vars for managed_node3 30575 1726867572.94031: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867572.94039: Calling all_plugins_play to load vars for managed_node3 30575 1726867572.94042: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867572.94045: Calling groups_plugins_play to load vars for managed_node3 30575 1726867572.96549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867572.99730: done with get_vars() 30575 1726867572.99756: done getting variables 30575 1726867573.00049: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:26:13 -0400 (0:00:00.090) 0:00:08.378 ****** 30575 1726867573.00092: entering _queue_task() for managed_node3/fail 30575 1726867573.00093: Creating lock for fail 30575 1726867573.00808: worker is 1 (out of 1 available) 30575 1726867573.00818: exiting _queue_task() for managed_node3/fail 30575 1726867573.00831: done queuing things up, now waiting for results queue to drain 30575 1726867573.00832: waiting for pending results... 30575 1726867573.01595: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867573.01624: in run() - task 0affcac9-a3a5-e081-a588-00000000020e 30575 1726867573.01644: variable 'ansible_search_path' from source: unknown 30575 1726867573.01652: variable 'ansible_search_path' from source: unknown 30575 1726867573.01733: calling self._execute() 30575 1726867573.01937: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867573.01946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867573.01962: variable 'omit' from source: magic vars 30575 1726867573.02631: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.02706: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867573.02937: variable 'network_state' from source: role '' defaults 30575 1726867573.02953: Evaluated conditional (network_state != {}): False 30575 1726867573.02960: when evaluation is False, skipping this task 30575 1726867573.02966: _execute() done 30575 1726867573.02973: dumping result to json 30575 1726867573.03023: done dumping result, returning 30575 1726867573.03035: done running TaskExecutor() 
for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-e081-a588-00000000020e] 30575 1726867573.03044: sending task result for task 0affcac9-a3a5-e081-a588-00000000020e 30575 1726867573.03302: done sending task result for task 0affcac9-a3a5-e081-a588-00000000020e 30575 1726867573.03305: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867573.03355: no more pending results, returning what we have 30575 1726867573.03359: results queue empty 30575 1726867573.03360: checking for any_errors_fatal 30575 1726867573.03368: done checking for any_errors_fatal 30575 1726867573.03368: checking for max_fail_percentage 30575 1726867573.03370: done checking for max_fail_percentage 30575 1726867573.03371: checking to see if all hosts have failed and the running result is not ok 30575 1726867573.03372: done checking to see if all hosts have failed 30575 1726867573.03373: getting the remaining hosts for this loop 30575 1726867573.03374: done getting the remaining hosts for this loop 30575 1726867573.03381: getting the next task for host managed_node3 30575 1726867573.03389: done getting next task for host managed_node3 30575 1726867573.03393: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867573.03398: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867573.03413: getting variables 30575 1726867573.03415: in VariableManager get_vars() 30575 1726867573.03449: Calling all_inventory to load vars for managed_node3 30575 1726867573.03452: Calling groups_inventory to load vars for managed_node3 30575 1726867573.03454: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867573.03466: Calling all_plugins_play to load vars for managed_node3 30575 1726867573.03468: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867573.03471: Calling groups_plugins_play to load vars for managed_node3 30575 1726867573.06357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867573.09505: done with get_vars() 30575 1726867573.09527: done getting variables 30575 1726867573.09588: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:26:13 -0400 (0:00:00.095) 0:00:08.473 ****** 30575 1726867573.09624: entering _queue_task() for managed_node3/fail 30575 1726867573.10514: worker is 1 (out of 1 available) 30575 1726867573.10524: exiting _queue_task() for managed_node3/fail 30575 1726867573.10535: done queuing things up, now waiting for results queue to drain 30575 1726867573.10537: waiting for pending results... 30575 1726867573.10996: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867573.11000: in run() - task 0affcac9-a3a5-e081-a588-00000000020f 30575 1726867573.11004: variable 'ansible_search_path' from source: unknown 30575 1726867573.11006: variable 'ansible_search_path' from source: unknown 30575 1726867573.11185: calling self._execute() 30575 1726867573.11233: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867573.11273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867573.11301: variable 'omit' from source: magic vars 30575 1726867573.11708: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.11727: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867573.11857: variable 'network_state' from source: role '' defaults 30575 1726867573.11871: Evaluated conditional (network_state != {}): False 30575 1726867573.11882: when evaluation is False, skipping this task 30575 1726867573.11885: _execute() done 30575 1726867573.11890: dumping result to json 30575 1726867573.11898: done dumping result, returning 30575 1726867573.11909: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-e081-a588-00000000020f] 30575 1726867573.11918: sending task result for task 0affcac9-a3a5-e081-a588-00000000020f skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867573.12060: no more pending results, returning what we have 30575 1726867573.12064: results queue empty 30575 1726867573.12064: checking for any_errors_fatal 30575 1726867573.12072: done checking for any_errors_fatal 30575 1726867573.12072: checking for max_fail_percentage 30575 1726867573.12074: done checking for max_fail_percentage 30575 1726867573.12075: checking to see if all hosts have failed and the running result is not ok 30575 1726867573.12076: done checking to see if all hosts have failed 30575 1726867573.12076: getting the remaining hosts for this loop 30575 1726867573.12080: done getting the remaining hosts for this loop 30575 1726867573.12084: getting the next task for host managed_node3 30575 1726867573.12092: done getting next task for host managed_node3 30575 1726867573.12096: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867573.12101: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867573.12115: getting variables 30575 1726867573.12116: in VariableManager get_vars() 30575 1726867573.12151: Calling all_inventory to load vars for managed_node3 30575 1726867573.12153: Calling groups_inventory to load vars for managed_node3 30575 1726867573.12155: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867573.12166: Calling all_plugins_play to load vars for managed_node3 30575 1726867573.12168: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867573.12171: Calling groups_plugins_play to load vars for managed_node3 30575 1726867573.12814: done sending task result for task 0affcac9-a3a5-e081-a588-00000000020f 30575 1726867573.12817: WORKER PROCESS EXITING 30575 1726867573.13562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867573.15904: done with get_vars() 30575 1726867573.15927: done getting variables 30575 1726867573.15982: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:26:13 -0400 (0:00:00.063) 0:00:08.537 ****** 30575 1726867573.16016: entering _queue_task() for managed_node3/fail 30575 1726867573.16297: worker is 1 (out of 1 available) 30575 1726867573.16309: exiting _queue_task() for managed_node3/fail 30575 1726867573.16322: done queuing things up, now waiting for results queue to drain 30575 1726867573.16326: waiting for pending results... 30575 1726867573.16705: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867573.16741: in run() - task 0affcac9-a3a5-e081-a588-000000000210 30575 1726867573.16760: variable 'ansible_search_path' from source: unknown 30575 1726867573.16769: variable 'ansible_search_path' from source: unknown 30575 1726867573.16840: calling self._execute() 30575 1726867573.17036: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867573.17284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867573.17288: variable 'omit' from source: magic vars 30575 1726867573.17554: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.17569: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867573.17755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867573.20801: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867573.20887: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867573.20930: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 
1726867573.20998: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867573.21036: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867573.21171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.21265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.21300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.21350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.21369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.21464: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.21485: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30575 1726867573.21607: variable 'ansible_distribution' from source: facts 30575 1726867573.21616: variable '__network_rh_distros' from source: role '' defaults 30575 1726867573.21637: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30575 1726867573.21899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.21961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.21965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.22012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.22033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.22183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.22189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.22192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.22194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.22209: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.22270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.22310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.22419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.22882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.22885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.23301: variable 'network_connections' from source: include params 30575 1726867573.23314: variable 'interface' from source: play vars 30575 1726867573.23382: variable 'interface' from source: play vars 30575 1726867573.23521: variable 'network_state' from source: role '' defaults 30575 1726867573.23598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867573.23981: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867573.24031: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 
1726867573.24086: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867573.24121: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867573.24174: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867573.24208: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867573.24250: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.24283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867573.24329: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30575 1726867573.24408: when evaluation is False, skipping this task 30575 1726867573.24411: _execute() done 30575 1726867573.24414: dumping result to json 30575 1726867573.24416: done dumping result, returning 30575 1726867573.24418: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-e081-a588-000000000210] 30575 1726867573.24421: sending task result for task 0affcac9-a3a5-e081-a588-000000000210 30575 1726867573.24495: done sending task 
result for task 0affcac9-a3a5-e081-a588-000000000210 30575 1726867573.24498: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30575 1726867573.24561: no more pending results, returning what we have 30575 1726867573.24565: results queue empty 30575 1726867573.24566: checking for any_errors_fatal 30575 1726867573.24573: done checking for any_errors_fatal 30575 1726867573.24573: checking for max_fail_percentage 30575 1726867573.24575: done checking for max_fail_percentage 30575 1726867573.24576: checking to see if all hosts have failed and the running result is not ok 30575 1726867573.24579: done checking to see if all hosts have failed 30575 1726867573.24580: getting the remaining hosts for this loop 30575 1726867573.24582: done getting the remaining hosts for this loop 30575 1726867573.24587: getting the next task for host managed_node3 30575 1726867573.24595: done getting next task for host managed_node3 30575 1726867573.24600: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867573.24606: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867573.24621: getting variables 30575 1726867573.24625: in VariableManager get_vars() 30575 1726867573.24665: Calling all_inventory to load vars for managed_node3 30575 1726867573.24668: Calling groups_inventory to load vars for managed_node3 30575 1726867573.24670: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867573.24784: Calling all_plugins_play to load vars for managed_node3 30575 1726867573.24788: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867573.24793: Calling groups_plugins_play to load vars for managed_node3 30575 1726867573.26683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867573.27729: done with get_vars() 30575 1726867573.27747: done getting variables 30575 1726867573.27820: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:26:13 -0400 (0:00:00.118) 0:00:08.656 ****** 30575 1726867573.27846: entering _queue_task() for managed_node3/dnf 30575 1726867573.28085: worker is 1 (out of 1 available) 30575 1726867573.28097: exiting _queue_task() for managed_node3/dnf 30575 1726867573.28111: done queuing things up, now waiting for results queue to drain 30575 1726867573.28112: waiting for pending results... 30575 1726867573.28394: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867573.28461: in run() - task 0affcac9-a3a5-e081-a588-000000000211 30575 1726867573.28488: variable 'ansible_search_path' from source: unknown 30575 1726867573.28502: variable 'ansible_search_path' from source: unknown 30575 1726867573.28683: calling self._execute() 30575 1726867573.28686: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867573.28689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867573.28692: variable 'omit' from source: magic vars 30575 1726867573.29015: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.29041: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867573.29241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867573.30739: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867573.30792: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867573.30818: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867573.30844: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867573.30864: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867573.30925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.30944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.30962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.30995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.31003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.31079: variable 'ansible_distribution' from source: facts 30575 1726867573.31083: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.31097: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30575 1726867573.31168: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867573.31269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.31293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.31481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.31484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.31487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.31489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.31491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.31492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.31510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.31530: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.31574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.31604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.31630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.31669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.31689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.31836: variable 'network_connections' from source: include params 30575 1726867573.31854: variable 'interface' from source: play vars 30575 1726867573.31934: variable 'interface' from source: play vars 30575 1726867573.32010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867573.32196: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867573.32241: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867573.32285: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867573.32319: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867573.32374: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867573.32404: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867573.32444: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.32485: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867573.32545: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867573.33051: variable 'network_connections' from source: include params 30575 1726867573.33055: variable 'interface' from source: play vars 30575 1726867573.33100: variable 'interface' from source: play vars 30575 1726867573.33128: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867573.33132: when evaluation is False, skipping this task 30575 1726867573.33134: _execute() done 30575 1726867573.33137: dumping result to json 30575 1726867573.33139: done dumping result, returning 30575 1726867573.33146: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000000211] 30575 
1726867573.33151: sending task result for task 0affcac9-a3a5-e081-a588-000000000211 30575 1726867573.33241: done sending task result for task 0affcac9-a3a5-e081-a588-000000000211 30575 1726867573.33243: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867573.33290: no more pending results, returning what we have 30575 1726867573.33293: results queue empty 30575 1726867573.33294: checking for any_errors_fatal 30575 1726867573.33299: done checking for any_errors_fatal 30575 1726867573.33300: checking for max_fail_percentage 30575 1726867573.33301: done checking for max_fail_percentage 30575 1726867573.33302: checking to see if all hosts have failed and the running result is not ok 30575 1726867573.33303: done checking to see if all hosts have failed 30575 1726867573.33304: getting the remaining hosts for this loop 30575 1726867573.33305: done getting the remaining hosts for this loop 30575 1726867573.33309: getting the next task for host managed_node3 30575 1726867573.33316: done getting next task for host managed_node3 30575 1726867573.33319: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867573.33324: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867573.33339: getting variables 30575 1726867573.33341: in VariableManager get_vars() 30575 1726867573.33375: Calling all_inventory to load vars for managed_node3 30575 1726867573.33385: Calling groups_inventory to load vars for managed_node3 30575 1726867573.33388: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867573.33397: Calling all_plugins_play to load vars for managed_node3 30575 1726867573.33399: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867573.33402: Calling groups_plugins_play to load vars for managed_node3 30575 1726867573.34258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867573.35683: done with get_vars() 30575 1726867573.35703: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867573.35766: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:26:13 -0400 (0:00:00.079) 0:00:08.735 ****** 30575 1726867573.35801: entering _queue_task() for managed_node3/yum 30575 1726867573.35803: Creating lock for yum 30575 1726867573.36094: worker is 1 (out of 1 available) 30575 1726867573.36107: exiting _queue_task() for managed_node3/yum 30575 1726867573.36119: done queuing things up, now waiting for results queue to drain 30575 1726867573.36121: waiting for pending results... 30575 1726867573.36499: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867573.36509: in run() - task 0affcac9-a3a5-e081-a588-000000000212 30575 1726867573.36513: variable 'ansible_search_path' from source: unknown 30575 1726867573.36516: variable 'ansible_search_path' from source: unknown 30575 1726867573.36541: calling self._execute() 30575 1726867573.36626: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867573.36637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867573.36650: variable 'omit' from source: magic vars 30575 1726867573.37033: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.37043: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867573.37163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867573.38614: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867573.38666: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867573.38782: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867573.38785: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867573.38787: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867573.38849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.38885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.38915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.38965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.38989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.39098: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.39118: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30575 1726867573.39137: when evaluation is False, skipping this task 30575 1726867573.39240: _execute() done 30575 1726867573.39243: dumping result to json 30575 1726867573.39246: done dumping result, returning 30575 1726867573.39249: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000000212] 30575 1726867573.39251: sending task result for task 0affcac9-a3a5-e081-a588-000000000212 30575 1726867573.39325: done sending task result for task 0affcac9-a3a5-e081-a588-000000000212 30575 1726867573.39328: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30575 1726867573.39384: no more pending results, returning what we have 30575 1726867573.39388: results queue empty 30575 1726867573.39389: checking for any_errors_fatal 30575 1726867573.39394: done checking for any_errors_fatal 30575 1726867573.39395: checking for max_fail_percentage 30575 1726867573.39397: done checking for max_fail_percentage 30575 1726867573.39398: checking to see if all hosts have failed and the running result is not ok 30575 1726867573.39399: done checking to see if all hosts have failed 30575 1726867573.39399: getting the remaining hosts for this loop 30575 1726867573.39401: done getting the remaining hosts for this loop 30575 1726867573.39405: getting the next task for host managed_node3 30575 1726867573.39413: done getting next task for host managed_node3 30575 1726867573.39418: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867573.39426: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867573.39443: getting variables 30575 1726867573.39445: in VariableManager get_vars() 30575 1726867573.39486: Calling all_inventory to load vars for managed_node3 30575 1726867573.39489: Calling groups_inventory to load vars for managed_node3 30575 1726867573.39491: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867573.39501: Calling all_plugins_play to load vars for managed_node3 30575 1726867573.39504: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867573.39507: Calling groups_plugins_play to load vars for managed_node3 30575 1726867573.40465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867573.41319: done with get_vars() 30575 1726867573.41334: done getting variables 30575 1726867573.41373: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:26:13 -0400 (0:00:00.055) 0:00:08.791 ****** 30575 1726867573.41399: entering _queue_task() for managed_node3/fail 30575 1726867573.41609: worker is 1 (out of 1 available) 30575 1726867573.41622: exiting _queue_task() for managed_node3/fail 30575 1726867573.41636: done queuing things up, now waiting for results queue to drain 30575 1726867573.41637: waiting for pending results... 30575 1726867573.41803: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867573.41881: in run() - task 0affcac9-a3a5-e081-a588-000000000213 30575 1726867573.41893: variable 'ansible_search_path' from source: unknown 30575 1726867573.41897: variable 'ansible_search_path' from source: unknown 30575 1726867573.41924: calling self._execute() 30575 1726867573.41990: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867573.41994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867573.42004: variable 'omit' from source: magic vars 30575 1726867573.42251: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.42260: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867573.42341: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867573.42465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867573.44138: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867573.44184: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867573.44210: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867573.44245: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867573.44268: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867573.44325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.44344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.44361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.44393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.44404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.44436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.44451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.44467: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.44498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.44508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.44536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.44552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.44567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.44597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.44607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.44711: variable 'network_connections' from source: include params 30575 1726867573.44720: variable 'interface' from source: play vars 30575 1726867573.44768: variable 'interface' from source: play vars 30575 1726867573.44820: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867573.44982: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867573.44986: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867573.44988: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867573.44990: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867573.45188: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867573.45191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867573.45194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.45196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867573.45198: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867573.45461: variable 'network_connections' from source: include params 30575 1726867573.45470: variable 'interface' from source: play vars 30575 1726867573.45544: variable 'interface' from source: play vars 30575 1726867573.45580: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867573.45590: when evaluation is False, skipping this task 30575 
1726867573.45597: _execute() done 30575 1726867573.45604: dumping result to json 30575 1726867573.45611: done dumping result, returning 30575 1726867573.45632: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000000213] 30575 1726867573.45641: sending task result for task 0affcac9-a3a5-e081-a588-000000000213 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867573.45890: no more pending results, returning what we have 30575 1726867573.45894: results queue empty 30575 1726867573.45895: checking for any_errors_fatal 30575 1726867573.45900: done checking for any_errors_fatal 30575 1726867573.45901: checking for max_fail_percentage 30575 1726867573.45903: done checking for max_fail_percentage 30575 1726867573.45904: checking to see if all hosts have failed and the running result is not ok 30575 1726867573.45905: done checking to see if all hosts have failed 30575 1726867573.45905: getting the remaining hosts for this loop 30575 1726867573.45907: done getting the remaining hosts for this loop 30575 1726867573.45911: getting the next task for host managed_node3 30575 1726867573.45919: done getting next task for host managed_node3 30575 1726867573.45922: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30575 1726867573.45930: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867573.45945: getting variables 30575 1726867573.45947: in VariableManager get_vars() 30575 1726867573.46046: Calling all_inventory to load vars for managed_node3 30575 1726867573.46048: Calling groups_inventory to load vars for managed_node3 30575 1726867573.46050: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867573.46060: Calling all_plugins_play to load vars for managed_node3 30575 1726867573.46063: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867573.46065: Calling groups_plugins_play to load vars for managed_node3 30575 1726867573.46979: done sending task result for task 0affcac9-a3a5-e081-a588-000000000213 30575 1726867573.46983: WORKER PROCESS EXITING 30575 1726867573.46993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867573.47837: done with get_vars() 30575 1726867573.47852: done getting variables 30575 1726867573.47894: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:26:13 -0400 (0:00:00.065) 0:00:08.856 ****** 30575 1726867573.47917: entering _queue_task() for managed_node3/package 30575 1726867573.48230: worker is 1 (out of 1 available) 30575 1726867573.48243: exiting _queue_task() for managed_node3/package 30575 1726867573.48256: done queuing things up, now waiting for results queue to drain 30575 1726867573.48257: waiting for pending results... 30575 1726867573.48907: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30575 1726867573.48973: in run() - task 0affcac9-a3a5-e081-a588-000000000214 30575 1726867573.48995: variable 'ansible_search_path' from source: unknown 30575 1726867573.49011: variable 'ansible_search_path' from source: unknown 30575 1726867573.49052: calling self._execute() 30575 1726867573.49143: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867573.49154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867573.49168: variable 'omit' from source: magic vars 30575 1726867573.49532: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.49558: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867573.49753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867573.50035: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867573.50097: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867573.50182: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867573.50185: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867573.50283: variable 'network_packages' from source: role '' defaults 30575 1726867573.50400: variable '__network_provider_setup' from source: role '' defaults 30575 1726867573.50420: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867573.50496: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867573.50510: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867573.50583: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867573.50784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867573.52846: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867573.52948: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867573.52980: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867573.53044: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867573.53082: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867573.53171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.53263: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.53266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.53296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.53315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.53373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.53411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.53451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.53517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.53541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 
1726867573.53804: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867573.54020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.54027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.54030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.54032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.54145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.54353: variable 'ansible_python' from source: facts 30575 1726867573.54385: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867573.54483: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867573.54584: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867573.54786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.54790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.54793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.54831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.54852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.54915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.55003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.55006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.55038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.55058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.55230: variable 'network_connections' from source: include params 
30575 1726867573.55246: variable 'interface' from source: play vars 30575 1726867573.55364: variable 'interface' from source: play vars 30575 1726867573.55473: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867573.55551: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867573.55555: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.55593: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867573.55656: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867573.56038: variable 'network_connections' from source: include params 30575 1726867573.56048: variable 'interface' from source: play vars 30575 1726867573.56212: variable 'interface' from source: play vars 30575 1726867573.56272: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867573.56434: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867573.56934: variable 'network_connections' from source: include params 30575 1726867573.57095: variable 'interface' from source: play vars 30575 1726867573.57099: variable 'interface' from source: play vars 30575 1726867573.57201: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867573.57487: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867573.58058: variable 'network_connections' 
from source: include params 30575 1726867573.58074: variable 'interface' from source: play vars 30575 1726867573.58166: variable 'interface' from source: play vars 30575 1726867573.58232: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867573.58291: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867573.58303: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867573.58360: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867573.58579: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867573.59088: variable 'network_connections' from source: include params 30575 1726867573.59091: variable 'interface' from source: play vars 30575 1726867573.59150: variable 'interface' from source: play vars 30575 1726867573.59160: variable 'ansible_distribution' from source: facts 30575 1726867573.59163: variable '__network_rh_distros' from source: role '' defaults 30575 1726867573.59168: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.59206: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867573.59370: variable 'ansible_distribution' from source: facts 30575 1726867573.59373: variable '__network_rh_distros' from source: role '' defaults 30575 1726867573.59381: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.59397: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867573.59559: variable 'ansible_distribution' from source: facts 30575 1726867573.59563: variable '__network_rh_distros' from source: role '' defaults 30575 1726867573.59565: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.59605: variable 'network_provider' from source: set_fact 30575 
1726867573.59629: variable 'ansible_facts' from source: unknown 30575 1726867573.60530: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30575 1726867573.60535: when evaluation is False, skipping this task 30575 1726867573.60537: _execute() done 30575 1726867573.60540: dumping result to json 30575 1726867573.60542: done dumping result, returning 30575 1726867573.60551: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-e081-a588-000000000214] 30575 1726867573.60556: sending task result for task 0affcac9-a3a5-e081-a588-000000000214 30575 1726867573.60647: done sending task result for task 0affcac9-a3a5-e081-a588-000000000214 30575 1726867573.60650: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30575 1726867573.60812: no more pending results, returning what we have 30575 1726867573.60816: results queue empty 30575 1726867573.60817: checking for any_errors_fatal 30575 1726867573.60822: done checking for any_errors_fatal 30575 1726867573.60825: checking for max_fail_percentage 30575 1726867573.60826: done checking for max_fail_percentage 30575 1726867573.60827: checking to see if all hosts have failed and the running result is not ok 30575 1726867573.60828: done checking to see if all hosts have failed 30575 1726867573.60829: getting the remaining hosts for this loop 30575 1726867573.60830: done getting the remaining hosts for this loop 30575 1726867573.60834: getting the next task for host managed_node3 30575 1726867573.60839: done getting next task for host managed_node3 30575 1726867573.60843: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867573.60848: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867573.60859: getting variables 30575 1726867573.60860: in VariableManager get_vars() 30575 1726867573.60900: Calling all_inventory to load vars for managed_node3 30575 1726867573.60902: Calling groups_inventory to load vars for managed_node3 30575 1726867573.60904: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867573.60912: Calling all_plugins_play to load vars for managed_node3 30575 1726867573.60914: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867573.60916: Calling groups_plugins_play to load vars for managed_node3 30575 1726867573.61872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867573.62826: done with get_vars() 30575 1726867573.62841: done getting variables 30575 1726867573.62887: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:26:13 -0400 (0:00:00.149) 0:00:09.006 ****** 30575 1726867573.62913: entering _queue_task() for managed_node3/package 30575 1726867573.63140: worker is 1 (out of 1 available) 30575 1726867573.63155: exiting _queue_task() for managed_node3/package 30575 1726867573.63168: done queuing things up, now waiting for results queue to drain 30575 1726867573.63170: waiting for pending results... 
30575 1726867573.63374: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867573.63683: in run() - task 0affcac9-a3a5-e081-a588-000000000215 30575 1726867573.63687: variable 'ansible_search_path' from source: unknown 30575 1726867573.63689: variable 'ansible_search_path' from source: unknown 30575 1726867573.63692: calling self._execute() 30575 1726867573.63694: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867573.63696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867573.63699: variable 'omit' from source: magic vars 30575 1726867573.64027: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.64048: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867573.64167: variable 'network_state' from source: role '' defaults 30575 1726867573.64185: Evaluated conditional (network_state != {}): False 30575 1726867573.64192: when evaluation is False, skipping this task 30575 1726867573.64199: _execute() done 30575 1726867573.64205: dumping result to json 30575 1726867573.64212: done dumping result, returning 30575 1726867573.64223: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000000215] 30575 1726867573.64234: sending task result for task 0affcac9-a3a5-e081-a588-000000000215 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867573.64402: no more pending results, returning what we have 30575 1726867573.64407: results queue empty 30575 1726867573.64407: checking for any_errors_fatal 30575 1726867573.64412: done checking for any_errors_fatal 30575 1726867573.64413: checking for max_fail_percentage 30575 
1726867573.64414: done checking for max_fail_percentage 30575 1726867573.64415: checking to see if all hosts have failed and the running result is not ok 30575 1726867573.64416: done checking to see if all hosts have failed 30575 1726867573.64417: getting the remaining hosts for this loop 30575 1726867573.64418: done getting the remaining hosts for this loop 30575 1726867573.64422: getting the next task for host managed_node3 30575 1726867573.64431: done getting next task for host managed_node3 30575 1726867573.64436: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867573.64442: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867573.64456: getting variables 30575 1726867573.64458: in VariableManager get_vars() 30575 1726867573.64498: Calling all_inventory to load vars for managed_node3 30575 1726867573.64501: Calling groups_inventory to load vars for managed_node3 30575 1726867573.64503: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867573.64516: Calling all_plugins_play to load vars for managed_node3 30575 1726867573.64519: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867573.64522: Calling groups_plugins_play to load vars for managed_node3 30575 1726867573.67787: done sending task result for task 0affcac9-a3a5-e081-a588-000000000215 30575 1726867573.67790: WORKER PROCESS EXITING 30575 1726867573.68467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867573.69311: done with get_vars() 30575 1726867573.69326: done getting variables 30575 1726867573.69359: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:26:13 -0400 (0:00:00.064) 0:00:09.071 ****** 30575 1726867573.69380: entering _queue_task() for managed_node3/package 30575 1726867573.69600: worker is 1 (out of 1 available) 30575 1726867573.69611: exiting _queue_task() for managed_node3/package 30575 1726867573.69626: done queuing things up, now waiting for results queue to drain 30575 1726867573.69628: waiting for pending results... 
30575 1726867573.69794: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867573.69885: in run() - task 0affcac9-a3a5-e081-a588-000000000216 30575 1726867573.69895: variable 'ansible_search_path' from source: unknown 30575 1726867573.69898: variable 'ansible_search_path' from source: unknown 30575 1726867573.69928: calling self._execute() 30575 1726867573.70186: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867573.70191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867573.70194: variable 'omit' from source: magic vars 30575 1726867573.70441: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.70457: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867573.70590: variable 'network_state' from source: role '' defaults 30575 1726867573.70605: Evaluated conditional (network_state != {}): False 30575 1726867573.70615: when evaluation is False, skipping this task 30575 1726867573.70641: _execute() done 30575 1726867573.70686: dumping result to json 30575 1726867573.70690: done dumping result, returning 30575 1726867573.70693: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000000216] 30575 1726867573.70696: sending task result for task 0affcac9-a3a5-e081-a588-000000000216 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867573.70960: no more pending results, returning what we have 30575 1726867573.70964: results queue empty 30575 1726867573.70964: checking for any_errors_fatal 30575 1726867573.70972: done checking for any_errors_fatal 30575 1726867573.70973: checking for max_fail_percentage 30575 
1726867573.70979: done checking for max_fail_percentage 30575 1726867573.70980: checking to see if all hosts have failed and the running result is not ok 30575 1726867573.70981: done checking to see if all hosts have failed 30575 1726867573.70982: getting the remaining hosts for this loop 30575 1726867573.70983: done getting the remaining hosts for this loop 30575 1726867573.70987: getting the next task for host managed_node3 30575 1726867573.70994: done getting next task for host managed_node3 30575 1726867573.70998: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867573.71002: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867573.71015: getting variables 30575 1726867573.71017: in VariableManager get_vars() 30575 1726867573.71048: Calling all_inventory to load vars for managed_node3 30575 1726867573.71050: Calling groups_inventory to load vars for managed_node3 30575 1726867573.71052: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867573.71060: Calling all_plugins_play to load vars for managed_node3 30575 1726867573.71062: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867573.71065: Calling groups_plugins_play to load vars for managed_node3 30575 1726867573.71819: done sending task result for task 0affcac9-a3a5-e081-a588-000000000216 30575 1726867573.71823: WORKER PROCESS EXITING 30575 1726867573.71834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867573.72771: done with get_vars() 30575 1726867573.72787: done getting variables 30575 1726867573.72856: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:26:13 -0400 (0:00:00.034) 0:00:09.106 ****** 30575 1726867573.72879: entering _queue_task() for managed_node3/service 30575 1726867573.72881: Creating lock for service 30575 1726867573.73076: worker is 1 (out of 1 available) 30575 1726867573.73092: exiting _queue_task() for managed_node3/service 30575 1726867573.73105: done queuing things up, now waiting for results queue to drain 30575 1726867573.73106: waiting for pending results... 
30575 1726867573.73257: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867573.73339: in run() - task 0affcac9-a3a5-e081-a588-000000000217 30575 1726867573.73349: variable 'ansible_search_path' from source: unknown 30575 1726867573.73353: variable 'ansible_search_path' from source: unknown 30575 1726867573.73381: calling self._execute() 30575 1726867573.73444: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867573.73448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867573.73450: variable 'omit' from source: magic vars 30575 1726867573.73696: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.73705: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867573.73787: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867573.73909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867573.75315: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867573.75366: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867573.75403: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867573.75429: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867573.75448: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867573.75505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30575 1726867573.75530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.75546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.75572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.75584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.75619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.75637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.75654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.75681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.75693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.75720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.75739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.75756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.75781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.75792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.75901: variable 'network_connections' from source: include params 30575 1726867573.75910: variable 'interface' from source: play vars 30575 1726867573.75961: variable 'interface' from source: play vars 30575 1726867573.76009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867573.76129: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867573.76154: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867573.76180: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867573.76202: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867573.76233: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867573.76248: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867573.76267: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.76289: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867573.76332: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867573.76480: variable 'network_connections' from source: include params 30575 1726867573.76484: variable 'interface' from source: play vars 30575 1726867573.76529: variable 'interface' from source: play vars 30575 1726867573.76551: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867573.76555: when evaluation is False, skipping this task 30575 1726867573.76557: _execute() done 30575 1726867573.76560: dumping result to json 30575 1726867573.76562: done dumping result, returning 30575 1726867573.76569: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000000217] 30575 1726867573.76573: sending task result for task 0affcac9-a3a5-e081-a588-000000000217 30575 1726867573.76652: done sending task result for task 
0affcac9-a3a5-e081-a588-000000000217 30575 1726867573.76660: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867573.76728: no more pending results, returning what we have 30575 1726867573.76730: results queue empty 30575 1726867573.76731: checking for any_errors_fatal 30575 1726867573.76736: done checking for any_errors_fatal 30575 1726867573.76737: checking for max_fail_percentage 30575 1726867573.76739: done checking for max_fail_percentage 30575 1726867573.76739: checking to see if all hosts have failed and the running result is not ok 30575 1726867573.76740: done checking to see if all hosts have failed 30575 1726867573.76741: getting the remaining hosts for this loop 30575 1726867573.76742: done getting the remaining hosts for this loop 30575 1726867573.76745: getting the next task for host managed_node3 30575 1726867573.76751: done getting next task for host managed_node3 30575 1726867573.76754: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867573.76758: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867573.76778: getting variables 30575 1726867573.76780: in VariableManager get_vars() 30575 1726867573.76808: Calling all_inventory to load vars for managed_node3 30575 1726867573.76810: Calling groups_inventory to load vars for managed_node3 30575 1726867573.76812: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867573.76819: Calling all_plugins_play to load vars for managed_node3 30575 1726867573.76821: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867573.76827: Calling groups_plugins_play to load vars for managed_node3 30575 1726867573.77546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867573.78406: done with get_vars() 30575 1726867573.78419: done getting variables 30575 1726867573.78457: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:26:13 -0400 (0:00:00.055) 0:00:09.162 ****** 30575 1726867573.78478: entering _queue_task() for managed_node3/service 30575 1726867573.78665: worker is 1 (out of 1 available) 30575 1726867573.78680: exiting _queue_task() for managed_node3/service 30575 1726867573.78691: done 
queuing things up, now waiting for results queue to drain 30575 1726867573.78692: waiting for pending results... 30575 1726867573.78847: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867573.78925: in run() - task 0affcac9-a3a5-e081-a588-000000000218 30575 1726867573.78935: variable 'ansible_search_path' from source: unknown 30575 1726867573.78939: variable 'ansible_search_path' from source: unknown 30575 1726867573.78965: calling self._execute() 30575 1726867573.79035: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867573.79039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867573.79047: variable 'omit' from source: magic vars 30575 1726867573.79313: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.79322: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867573.79435: variable 'network_provider' from source: set_fact 30575 1726867573.79439: variable 'network_state' from source: role '' defaults 30575 1726867573.79447: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30575 1726867573.79452: variable 'omit' from source: magic vars 30575 1726867573.79492: variable 'omit' from source: magic vars 30575 1726867573.79514: variable 'network_service_name' from source: role '' defaults 30575 1726867573.79564: variable 'network_service_name' from source: role '' defaults 30575 1726867573.79636: variable '__network_provider_setup' from source: role '' defaults 30575 1726867573.79640: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867573.79686: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867573.79694: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867573.79741: variable '__network_packages_default_nm' from source: role '' 
defaults 30575 1726867573.79888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867573.81273: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867573.81550: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867573.81575: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867573.81607: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867573.81630: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867573.81689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.81707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.81727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.81753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.81764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.81797: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.81813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.81831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.81856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.81867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.82002: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867573.82074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.82092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.82108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.82137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.82147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.82208: variable 'ansible_python' from source: facts 30575 1726867573.82219: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867573.82276: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867573.82329: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867573.82412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.82430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.82450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.82473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.82486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.82519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867573.82539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867573.82555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.82585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867573.82595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867573.82693: variable 'network_connections' from source: include params 30575 1726867573.82698: variable 'interface' from source: play vars 30575 1726867573.82751: variable 'interface' from source: play vars 30575 1726867573.82823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867573.82939: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867573.82988: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867573.83018: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867573.83049: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867573.83092: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867573.83115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867573.83140: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867573.83163: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867573.83199: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867573.83373: variable 'network_connections' from source: include params 30575 1726867573.83379: variable 'interface' from source: play vars 30575 1726867573.83433: variable 'interface' from source: play vars 30575 1726867573.83463: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867573.83517: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867573.83703: variable 'network_connections' from source: include params 30575 1726867573.83706: variable 'interface' from source: play vars 30575 1726867573.83762: variable 'interface' from source: play vars 30575 1726867573.83776: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867573.83831: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867573.84012: variable 'network_connections' from source: include params 30575 1726867573.84015: variable 'interface' from source: play vars 30575 1726867573.84067: variable 'interface' from source: play vars 30575 1726867573.84110: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30575 1726867573.84154: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867573.84159: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867573.84204: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867573.84336: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867573.84641: variable 'network_connections' from source: include params 30575 1726867573.84644: variable 'interface' from source: play vars 30575 1726867573.84688: variable 'interface' from source: play vars 30575 1726867573.84695: variable 'ansible_distribution' from source: facts 30575 1726867573.84698: variable '__network_rh_distros' from source: role '' defaults 30575 1726867573.84703: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.84724: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867573.84840: variable 'ansible_distribution' from source: facts 30575 1726867573.84843: variable '__network_rh_distros' from source: role '' defaults 30575 1726867573.84846: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.84854: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867573.84962: variable 'ansible_distribution' from source: facts 30575 1726867573.84970: variable '__network_rh_distros' from source: role '' defaults 30575 1726867573.84975: variable 'ansible_distribution_major_version' from source: facts 30575 1726867573.85003: variable 'network_provider' from source: set_fact 30575 1726867573.85018: variable 'omit' from source: magic vars 30575 1726867573.85037: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867573.85055: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867573.85074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867573.85094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867573.85097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867573.85118: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867573.85121: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867573.85126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867573.85188: Set connection var ansible_pipelining to False 30575 1726867573.85192: Set connection var ansible_shell_type to sh 30575 1726867573.85197: Set connection var ansible_shell_executable to /bin/sh 30575 1726867573.85202: Set connection var ansible_timeout to 10 30575 1726867573.85207: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867573.85215: Set connection var ansible_connection to ssh 30575 1726867573.85232: variable 'ansible_shell_executable' from source: unknown 30575 1726867573.85235: variable 'ansible_connection' from source: unknown 30575 1726867573.85237: variable 'ansible_module_compression' from source: unknown 30575 1726867573.85239: variable 'ansible_shell_type' from source: unknown 30575 1726867573.85242: variable 'ansible_shell_executable' from source: unknown 30575 1726867573.85245: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867573.85249: variable 'ansible_pipelining' from source: unknown 30575 1726867573.85251: variable 'ansible_timeout' from source: unknown 30575 1726867573.85255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867573.85327: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867573.85334: variable 'omit' from source: magic vars 30575 1726867573.85337: starting attempt loop 30575 1726867573.85340: running the handler 30575 1726867573.85394: variable 'ansible_facts' from source: unknown 30575 1726867573.85763: _low_level_execute_command(): starting 30575 1726867573.85769: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867573.86268: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867573.86272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867573.86275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867573.86278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867573.86281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867573.86333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' <<< 30575 1726867573.86336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867573.86338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867573.86402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867573.88095: stdout chunk (state=3): >>>/root <<< 30575 1726867573.88191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867573.88222: stderr chunk (state=3): >>><<< 30575 1726867573.88225: stdout chunk (state=3): >>><<< 30575 1726867573.88241: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867573.88250: _low_level_execute_command(): starting 30575 1726867573.88257: _low_level_execute_command(): executing: /bin/sh 
-c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867573.8824062-30990-110230600292228 `" && echo ansible-tmp-1726867573.8824062-30990-110230600292228="` echo /root/.ansible/tmp/ansible-tmp-1726867573.8824062-30990-110230600292228 `" ) && sleep 0' 30575 1726867573.88676: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867573.88681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867573.88684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867573.88686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867573.88688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867573.88737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867573.88740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867573.88792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867573.90663: stdout chunk (state=3): 
>>>ansible-tmp-1726867573.8824062-30990-110230600292228=/root/.ansible/tmp/ansible-tmp-1726867573.8824062-30990-110230600292228 <<< 30575 1726867573.90772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867573.90796: stderr chunk (state=3): >>><<< 30575 1726867573.90799: stdout chunk (state=3): >>><<< 30575 1726867573.90812: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867573.8824062-30990-110230600292228=/root/.ansible/tmp/ansible-tmp-1726867573.8824062-30990-110230600292228 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867573.90837: variable 'ansible_module_compression' from source: unknown 30575 1726867573.90881: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 30575 1726867573.90885: ANSIBALLZ: Acquiring lock 30575 1726867573.90888: ANSIBALLZ: Lock acquired: 140240646918832 30575 
1726867573.90890: ANSIBALLZ: Creating module 30575 1726867574.09904: ANSIBALLZ: Writing module into payload 30575 1726867574.10056: ANSIBALLZ: Writing module 30575 1726867574.10114: ANSIBALLZ: Renaming module 30575 1726867574.10118: ANSIBALLZ: Done creating module 30575 1726867574.10129: variable 'ansible_facts' from source: unknown 30575 1726867574.10441: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867573.8824062-30990-110230600292228/AnsiballZ_systemd.py 30575 1726867574.10563: Sending initial data 30575 1726867574.10571: Sent initial data (156 bytes) 30575 1726867574.11084: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867574.11093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867574.11099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867574.11102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867574.11205: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867574.11212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867574.11223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 
1726867574.11235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867574.11311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867574.13154: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867574.13180: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867574.13230: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpkdsjxz8i /root/.ansible/tmp/ansible-tmp-1726867573.8824062-30990-110230600292228/AnsiballZ_systemd.py <<< 30575 1726867574.13233: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867573.8824062-30990-110230600292228/AnsiballZ_systemd.py" <<< 30575 1726867574.13411: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpkdsjxz8i" to remote "/root/.ansible/tmp/ansible-tmp-1726867573.8824062-30990-110230600292228/AnsiballZ_systemd.py" <<< 30575 1726867574.13414: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867573.8824062-30990-110230600292228/AnsiballZ_systemd.py" <<< 30575 1726867574.16582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867574.16891: stderr chunk (state=3): >>><<< 30575 1726867574.16894: stdout chunk (state=3): >>><<< 30575 1726867574.16912: done transferring module to remote 30575 1726867574.16923: _low_level_execute_command(): starting 30575 1726867574.16932: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867573.8824062-30990-110230600292228/ /root/.ansible/tmp/ansible-tmp-1726867573.8824062-30990-110230600292228/AnsiballZ_systemd.py && sleep 0' 30575 1726867574.18149: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867574.18153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867574.18155: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867574.18157: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867574.18159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867574.18392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867574.20164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867574.20301: stderr chunk (state=3): >>><<< 30575 1726867574.20304: stdout chunk (state=3): >>><<< 30575 1726867574.20320: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867574.20323: _low_level_execute_command(): starting 30575 1726867574.20332: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867573.8824062-30990-110230600292228/AnsiballZ_systemd.py && sleep 0' 30575 1726867574.21454: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867574.21460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867574.21479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867574.21483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867574.21684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867574.21704: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867574.21775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867574.50500: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10477568", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3336511488", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1742368000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": 
"infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw 
cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": 
"NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", 
"AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30575 1726867574.52325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867574.52348: stderr chunk (state=3): >>><<< 30575 1726867574.52351: stdout chunk (state=3): >>><<< 30575 1726867574.52369: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10477568", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3336511488", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1742368000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": 
"infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", 
"CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867574.52488: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867573.8824062-30990-110230600292228/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867574.52503: _low_level_execute_command(): starting 30575 1726867574.52506: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867573.8824062-30990-110230600292228/ > /dev/null 2>&1 && sleep 0' 30575 1726867574.52921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867574.52925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 
1726867574.52927: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867574.52929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867574.52983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867574.52990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867574.52992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867574.53032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867574.54863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867574.54866: stderr chunk (state=3): >>><<< 30575 1726867574.54869: stdout chunk (state=3): >>><<< 30575 1726867574.54887: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867574.54890: handler run complete 30575 1726867574.54928: attempt loop complete, returning result 30575 1726867574.54931: _execute() done 30575 1726867574.54933: dumping result to json 30575 1726867574.54947: done dumping result, returning 30575 1726867574.54955: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-e081-a588-000000000218] 30575 1726867574.54961: sending task result for task 0affcac9-a3a5-e081-a588-000000000218 30575 1726867574.55180: done sending task result for task 0affcac9-a3a5-e081-a588-000000000218 30575 1726867574.55183: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867574.55230: no more pending results, returning what we have 30575 1726867574.55233: results queue empty 30575 1726867574.55234: checking for any_errors_fatal 30575 1726867574.55238: done checking for any_errors_fatal 30575 1726867574.55239: checking for max_fail_percentage 30575 1726867574.55240: done checking for max_fail_percentage 30575 1726867574.55241: checking to see if all hosts have failed and the running result is not ok 30575 1726867574.55242: done checking to see if all hosts have failed 30575 1726867574.55243: getting the remaining hosts for this loop 30575 1726867574.55244: done getting the remaining hosts for this loop 30575 1726867574.55247: 
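The `ansible.legacy.systemd` invocation recorded above (module args `name=NetworkManager`, `state=started`, `enabled=True`, with `no_log` censoring the result) corresponds to a task roughly like the following sketch. This is an illustration reconstructed from the logged module arguments only; the authoritative task definition lives in the `fedora.linux_system_roles.network` role's tasks file and may differ:

```yaml
# Illustrative sketch, not the role's actual source. The module name and
# arguments are taken verbatim from the _execute_module() call in the log.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true  # matches the "output has been hidden" censoring seen above
```

Because `no_log: true` is set, the task result prints only the `censored` key, which is exactly what the `ok: [managed_node3]` block in the log shows.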
getting the next task for host managed_node3 30575 1726867574.55253: done getting next task for host managed_node3 30575 1726867574.55256: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867574.55261: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867574.55271: getting variables 30575 1726867574.55272: in VariableManager get_vars() 30575 1726867574.55308: Calling all_inventory to load vars for managed_node3 30575 1726867574.55311: Calling groups_inventory to load vars for managed_node3 30575 1726867574.55312: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867574.55321: Calling all_plugins_play to load vars for managed_node3 30575 1726867574.55325: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867574.55328: Calling groups_plugins_play to load vars for managed_node3 30575 1726867574.56221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867574.57070: done with get_vars() 30575 1726867574.57086: done getting variables 30575 1726867574.57129: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:26:14 -0400 (0:00:00.786) 0:00:09.949 ****** 30575 1726867574.57156: entering _queue_task() for managed_node3/service 30575 1726867574.57363: worker is 1 (out of 1 available) 30575 1726867574.57375: exiting _queue_task() for managed_node3/service 30575 1726867574.57388: done queuing things up, now waiting for results queue to drain 30575 1726867574.57390: waiting for pending results... 
30575 1726867574.57559: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867574.57648: in run() - task 0affcac9-a3a5-e081-a588-000000000219 30575 1726867574.57658: variable 'ansible_search_path' from source: unknown 30575 1726867574.57662: variable 'ansible_search_path' from source: unknown 30575 1726867574.57692: calling self._execute() 30575 1726867574.57753: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867574.57757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867574.57765: variable 'omit' from source: magic vars 30575 1726867574.58030: variable 'ansible_distribution_major_version' from source: facts 30575 1726867574.58038: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867574.58119: variable 'network_provider' from source: set_fact 30575 1726867574.58125: Evaluated conditional (network_provider == "nm"): True 30575 1726867574.58188: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867574.58249: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867574.58361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867574.59742: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867574.59788: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867574.59815: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867574.59840: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867574.59860: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867574.60083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867574.60087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867574.60089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867574.60092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867574.60094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867574.60115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867574.60144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867574.60176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867574.60223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867574.60330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867574.60789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867574.60982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867574.60986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867574.60989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867574.60991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867574.61065: variable 'network_connections' from source: include params 30575 1726867574.61092: variable 'interface' from source: play vars 30575 1726867574.61174: variable 'interface' from source: play vars 30575 1726867574.61256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867574.61432: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867574.61475: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867574.61520: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867574.61556: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867574.61608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867574.61636: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867574.61670: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867574.61706: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867574.61763: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867574.62022: variable 'network_connections' from source: include params 30575 1726867574.62034: variable 'interface' from source: play vars 30575 1726867574.62103: variable 'interface' from source: play vars 30575 1726867574.62147: Evaluated conditional (__network_wpa_supplicant_required): False 30575 1726867574.62156: when evaluation is False, skipping this task 30575 1726867574.62164: _execute() done 30575 1726867574.62172: dumping result to json 30575 1726867574.62186: done dumping result, returning 30575 1726867574.62199: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-e081-a588-000000000219] 30575 
1726867574.62292: sending task result for task 0affcac9-a3a5-e081-a588-000000000219 30575 1726867574.62365: done sending task result for task 0affcac9-a3a5-e081-a588-000000000219 30575 1726867574.62369: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30575 1726867574.62445: no more pending results, returning what we have 30575 1726867574.62453: results queue empty 30575 1726867574.62454: checking for any_errors_fatal 30575 1726867574.62491: done checking for any_errors_fatal 30575 1726867574.62493: checking for max_fail_percentage 30575 1726867574.62495: done checking for max_fail_percentage 30575 1726867574.62496: checking to see if all hosts have failed and the running result is not ok 30575 1726867574.62497: done checking to see if all hosts have failed 30575 1726867574.62498: getting the remaining hosts for this loop 30575 1726867574.62499: done getting the remaining hosts for this loop 30575 1726867574.62504: getting the next task for host managed_node3 30575 1726867574.62513: done getting next task for host managed_node3 30575 1726867574.62517: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867574.62523: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
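The skip above reports `false_condition: __network_wpa_supplicant_required`, i.e. the task's `when:` clause evaluated to False before the module ran. A hedged sketch of such a conditional service task follows; since the task was skipped, its module arguments never appear in the log, so everything below except the task name and the condition is an assumption:

```yaml
# Illustrative sketch. Only the task name and the skip condition
# (__network_wpa_supplicant_required) are confirmed by the log; the
# service arguments are hypothetical.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant   # hypothetical argument for illustration
    state: started
    enabled: true
  when: __network_wpa_supplicant_required | bool
```

When the `when:` expression is falsy, Ansible emits the `skipping: [host]` result with `skip_reason: "Conditional result was False"`, matching the log.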
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867574.62539: getting variables 30575 1726867574.62541: in VariableManager get_vars() 30575 1726867574.62782: Calling all_inventory to load vars for managed_node3 30575 1726867574.62785: Calling groups_inventory to load vars for managed_node3 30575 1726867574.62788: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867574.62797: Calling all_plugins_play to load vars for managed_node3 30575 1726867574.62799: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867574.62802: Calling groups_plugins_play to load vars for managed_node3 30575 1726867574.64222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867574.66622: done with get_vars() 30575 1726867574.66646: done getting variables 30575 1726867574.66713: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:26:14 -0400 (0:00:00.095) 0:00:10.045 
****** 30575 1726867574.66748: entering _queue_task() for managed_node3/service 30575 1726867574.67096: worker is 1 (out of 1 available) 30575 1726867574.67131: exiting _queue_task() for managed_node3/service 30575 1726867574.67144: done queuing things up, now waiting for results queue to drain 30575 1726867574.67145: waiting for pending results... 30575 1726867574.67453: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867574.67660: in run() - task 0affcac9-a3a5-e081-a588-00000000021a 30575 1726867574.67664: variable 'ansible_search_path' from source: unknown 30575 1726867574.67667: variable 'ansible_search_path' from source: unknown 30575 1726867574.67671: calling self._execute() 30575 1726867574.67732: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867574.67745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867574.67767: variable 'omit' from source: magic vars 30575 1726867574.68159: variable 'ansible_distribution_major_version' from source: facts 30575 1726867574.68175: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867574.68311: variable 'network_provider' from source: set_fact 30575 1726867574.68382: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867574.68386: when evaluation is False, skipping this task 30575 1726867574.68388: _execute() done 30575 1726867574.68390: dumping result to json 30575 1726867574.68393: done dumping result, returning 30575 1726867574.68395: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-e081-a588-00000000021a] 30575 1726867574.68397: sending task result for task 0affcac9-a3a5-e081-a588-00000000021a skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
30575 1726867574.68569: no more pending results, returning what we have 30575 1726867574.68574: results queue empty 30575 1726867574.68575: checking for any_errors_fatal 30575 1726867574.68586: done checking for any_errors_fatal 30575 1726867574.68587: checking for max_fail_percentage 30575 1726867574.68589: done checking for max_fail_percentage 30575 1726867574.68590: checking to see if all hosts have failed and the running result is not ok 30575 1726867574.68591: done checking to see if all hosts have failed 30575 1726867574.68591: getting the remaining hosts for this loop 30575 1726867574.68593: done getting the remaining hosts for this loop 30575 1726867574.68597: getting the next task for host managed_node3 30575 1726867574.68606: done getting next task for host managed_node3 30575 1726867574.68610: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867574.68616: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867574.68636: getting variables 30575 1726867574.68638: in VariableManager get_vars() 30575 1726867574.68674: Calling all_inventory to load vars for managed_node3 30575 1726867574.68678: Calling groups_inventory to load vars for managed_node3 30575 1726867574.68681: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867574.68693: Calling all_plugins_play to load vars for managed_node3 30575 1726867574.68697: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867574.68700: Calling groups_plugins_play to load vars for managed_node3 30575 1726867574.69296: done sending task result for task 0affcac9-a3a5-e081-a588-00000000021a 30575 1726867574.69299: WORKER PROCESS EXITING 30575 1726867574.70406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867574.72628: done with get_vars() 30575 1726867574.72647: done getting variables 30575 1726867574.73016: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:26:14 -0400 (0:00:00.063) 0:00:10.108 ****** 30575 1726867574.73053: entering _queue_task() for managed_node3/copy 30575 1726867574.74211: worker is 1 (out of 1 available) 30575 1726867574.74220: exiting _queue_task() for managed_node3/copy 30575 1726867574.74232: done queuing things up, now waiting for results queue to drain 30575 1726867574.74233: waiting for pending 
results... 30575 1726867574.74896: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867574.75342: in run() - task 0affcac9-a3a5-e081-a588-00000000021b 30575 1726867574.75345: variable 'ansible_search_path' from source: unknown 30575 1726867574.75348: variable 'ansible_search_path' from source: unknown 30575 1726867574.75351: calling self._execute() 30575 1726867574.75354: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867574.75559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867574.75578: variable 'omit' from source: magic vars 30575 1726867574.76559: variable 'ansible_distribution_major_version' from source: facts 30575 1726867574.76575: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867574.76772: variable 'network_provider' from source: set_fact 30575 1726867574.76839: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867574.76847: when evaluation is False, skipping this task 30575 1726867574.76855: _execute() done 30575 1726867574.76863: dumping result to json 30575 1726867574.76945: done dumping result, returning 30575 1726867574.77048: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-e081-a588-00000000021b] 30575 1726867574.77052: sending task result for task 0affcac9-a3a5-e081-a588-00000000021b 30575 1726867574.77130: done sending task result for task 0affcac9-a3a5-e081-a588-00000000021b 30575 1726867574.77133: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30575 1726867574.77201: no more pending results, returning what we have 30575 1726867574.77205: results queue empty 30575 1726867574.77206: 
checking for any_errors_fatal 30575 1726867574.77213: done checking for any_errors_fatal 30575 1726867574.77214: checking for max_fail_percentage 30575 1726867574.77216: done checking for max_fail_percentage 30575 1726867574.77216: checking to see if all hosts have failed and the running result is not ok 30575 1726867574.77217: done checking to see if all hosts have failed 30575 1726867574.77218: getting the remaining hosts for this loop 30575 1726867574.77220: done getting the remaining hosts for this loop 30575 1726867574.77227: getting the next task for host managed_node3 30575 1726867574.77235: done getting next task for host managed_node3 30575 1726867574.77239: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867574.77245: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867574.77261: getting variables 30575 1726867574.77263: in VariableManager get_vars() 30575 1726867574.77302: Calling all_inventory to load vars for managed_node3 30575 1726867574.77305: Calling groups_inventory to load vars for managed_node3 30575 1726867574.77307: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867574.77319: Calling all_plugins_play to load vars for managed_node3 30575 1726867574.77322: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867574.77328: Calling groups_plugins_play to load vars for managed_node3 30575 1726867574.80524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867574.83748: done with get_vars() 30575 1726867574.83781: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:26:14 -0400 (0:00:00.108) 0:00:10.216 ****** 30575 1726867574.83872: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867574.83874: Creating lock for fedora.linux_system_roles.network_connections 30575 1726867574.84637: worker is 1 (out of 1 available) 30575 1726867574.84649: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867574.84662: done queuing things up, now waiting for results queue to drain 30575 1726867574.84664: waiting for pending results... 
30575 1726867574.85296: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867574.85306: in run() - task 0affcac9-a3a5-e081-a588-00000000021c 30575 1726867574.85310: variable 'ansible_search_path' from source: unknown 30575 1726867574.85313: variable 'ansible_search_path' from source: unknown 30575 1726867574.85518: calling self._execute() 30575 1726867574.85610: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867574.85899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867574.85903: variable 'omit' from source: magic vars 30575 1726867574.86369: variable 'ansible_distribution_major_version' from source: facts 30575 1726867574.86683: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867574.86687: variable 'omit' from source: magic vars 30575 1726867574.86689: variable 'omit' from source: magic vars 30575 1726867574.86822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867574.91183: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867574.91187: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867574.91189: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867574.91191: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867574.91587: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867574.91591: variable 'network_provider' from source: set_fact 30575 1726867574.91811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867574.91842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867574.91876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867574.91923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867574.91959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867574.92218: variable 'omit' from source: magic vars 30575 1726867574.92340: variable 'omit' from source: magic vars 30575 1726867574.92459: variable 'network_connections' from source: include params 30575 1726867574.92595: variable 'interface' from source: play vars 30575 1726867574.92726: variable 'interface' from source: play vars 30575 1726867574.92925: variable 'omit' from source: magic vars 30575 1726867574.93108: variable '__lsr_ansible_managed' from source: task vars 30575 1726867574.93168: variable '__lsr_ansible_managed' from source: task vars 30575 1726867574.93586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30575 1726867574.93996: Loaded config def from plugin (lookup/template) 30575 1726867574.94283: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30575 1726867574.94286: File lookup term: get_ansible_managed.j2 30575 1726867574.94289: variable 
'ansible_search_path' from source: unknown 30575 1726867574.94293: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30575 1726867574.94297: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30575 1726867574.94300: variable 'ansible_search_path' from source: unknown 30575 1726867575.07266: variable 'ansible_managed' from source: unknown 30575 1726867575.07414: variable 'omit' from source: magic vars 30575 1726867575.07444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867575.07594: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867575.07611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867575.07704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30575 1726867575.07715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867575.07746: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867575.07749: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867575.07752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867575.07849: Set connection var ansible_pipelining to False 30575 1726867575.07853: Set connection var ansible_shell_type to sh 30575 1726867575.07859: Set connection var ansible_shell_executable to /bin/sh 30575 1726867575.07863: Set connection var ansible_timeout to 10 30575 1726867575.07869: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867575.07878: Set connection var ansible_connection to ssh 30575 1726867575.08300: variable 'ansible_shell_executable' from source: unknown 30575 1726867575.08303: variable 'ansible_connection' from source: unknown 30575 1726867575.08305: variable 'ansible_module_compression' from source: unknown 30575 1726867575.08307: variable 'ansible_shell_type' from source: unknown 30575 1726867575.08308: variable 'ansible_shell_executable' from source: unknown 30575 1726867575.08311: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867575.08313: variable 'ansible_pipelining' from source: unknown 30575 1726867575.08315: variable 'ansible_timeout' from source: unknown 30575 1726867575.08316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867575.08465: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867575.08476: variable 'omit' from 
source: magic vars 30575 1726867575.08502: starting attempt loop 30575 1726867575.08505: running the handler 30575 1726867575.08519: _low_level_execute_command(): starting 30575 1726867575.08527: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867575.10085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867575.10231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867575.10278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867575.11958: stdout chunk (state=3): >>>/root <<< 30575 1726867575.12219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867575.12223: stdout chunk (state=3): >>><<< 30575 1726867575.12225: stderr chunk (state=3): >>><<< 30575 1726867575.12267: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867575.12270: _low_level_execute_command(): starting 30575 1726867575.12273: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867575.122431-31050-16945831672295 `" && echo ansible-tmp-1726867575.122431-31050-16945831672295="` echo /root/.ansible/tmp/ansible-tmp-1726867575.122431-31050-16945831672295 `" ) && sleep 0' 30575 1726867575.13492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867575.13507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867575.13523: stderr chunk (state=3): >>>debug2: match found <<< 30575 1726867575.13602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867575.13752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867575.13865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867575.15754: stdout chunk (state=3): >>>ansible-tmp-1726867575.122431-31050-16945831672295=/root/.ansible/tmp/ansible-tmp-1726867575.122431-31050-16945831672295 <<< 30575 1726867575.15916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867575.15920: stdout chunk (state=3): >>><<< 30575 1726867575.15922: stderr chunk (state=3): >>><<< 30575 1726867575.16037: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867575.122431-31050-16945831672295=/root/.ansible/tmp/ansible-tmp-1726867575.122431-31050-16945831672295 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867575.16051: variable 'ansible_module_compression' from source: unknown 30575 1726867575.16287: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 30575 1726867575.16290: ANSIBALLZ: Acquiring lock 30575 1726867575.16293: ANSIBALLZ: Lock acquired: 140240647017136 30575 1726867575.16297: ANSIBALLZ: Creating module 30575 1726867575.49916: ANSIBALLZ: Writing module into payload 30575 1726867575.50267: ANSIBALLZ: Writing module 30575 1726867575.50300: ANSIBALLZ: Renaming module 30575 1726867575.50312: ANSIBALLZ: Done creating module 30575 1726867575.50342: variable 'ansible_facts' from source: unknown 30575 1726867575.50459: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867575.122431-31050-16945831672295/AnsiballZ_network_connections.py 30575 1726867575.50598: Sending initial data 30575 1726867575.50718: Sent initial data (166 bytes) 30575 1726867575.51371: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867575.51805: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867575.51870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867575.53518: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30575 1726867575.53537: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30575 1726867575.53601: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30575 1726867575.53620: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867575.53641: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 30575 1726867575.53684: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpto4h8o3h /root/.ansible/tmp/ansible-tmp-1726867575.122431-31050-16945831672295/AnsiballZ_network_connections.py <<< 30575 1726867575.53698: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867575.122431-31050-16945831672295/AnsiballZ_network_connections.py" <<< 30575 1726867575.53745: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpto4h8o3h" to remote "/root/.ansible/tmp/ansible-tmp-1726867575.122431-31050-16945831672295/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867575.122431-31050-16945831672295/AnsiballZ_network_connections.py" <<< 30575 1726867575.55688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867575.55692: stdout chunk (state=3): >>><<< 30575 1726867575.55694: stderr chunk (state=3): >>><<< 30575 1726867575.55696: done transferring module to remote 30575 1726867575.55698: _low_level_execute_command(): starting 30575 1726867575.55701: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867575.122431-31050-16945831672295/ /root/.ansible/tmp/ansible-tmp-1726867575.122431-31050-16945831672295/AnsiballZ_network_connections.py && sleep 0' 30575 1726867575.56800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867575.56809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867575.56820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867575.56834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867575.56846: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867575.56853: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867575.56863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867575.56881: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867575.57184: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867575.57226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867575.59070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867575.59284: stderr chunk (state=3): >>><<< 30575 1726867575.59288: stdout chunk (state=3): >>><<< 30575 1726867575.59291: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867575.59293: _low_level_execute_command(): starting 30575 1726867575.59295: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867575.122431-31050-16945831672295/AnsiballZ_network_connections.py && sleep 0' 30575 1726867575.60216: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867575.60226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867575.60245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867575.60384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867575.60389: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867575.60434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867575.60440: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867575.60599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867575.60659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867575.88119: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f5796ae9-39ec-4c12-a218-e4d84e010b7f\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30575 1726867575.91589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867575.91594: stdout chunk (state=3): >>><<< 30575 1726867575.91596: stderr chunk (state=3): >>><<< 30575 1726867575.91599: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f5796ae9-39ec-4c12-a218-e4d84e010b7f\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867575.91790: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867575.122431-31050-16945831672295/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867575.91794: _low_level_execute_command(): starting 30575 1726867575.91797: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867575.122431-31050-16945831672295/ > /dev/null 2>&1 && sleep 0' 30575 1726867575.93152: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867575.93307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867575.93461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867575.93691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867575.95653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867575.95658: stdout chunk (state=3): >>><<< 30575 1726867575.95660: stderr chunk (state=3): >>><<< 30575 1726867575.95663: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867575.95665: handler run complete 30575 1726867575.95865: attempt loop complete, returning result 30575 1726867575.95868: _execute() done 30575 1726867575.95871: dumping result to json 30575 1726867575.95872: done dumping result, returning 30575 1726867575.95875: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-e081-a588-00000000021c] 30575 1726867575.95879: sending task result for task 0affcac9-a3a5-e081-a588-00000000021c 30575 1726867575.95956: done sending task result for task 0affcac9-a3a5-e081-a588-00000000021c 30575 1726867575.95959: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f5796ae9-39ec-4c12-a218-e4d84e010b7f 30575 1726867575.96073: no more pending results, returning what we have 30575 1726867575.96185: results queue empty 30575 1726867575.96186: checking for any_errors_fatal 30575 1726867575.96195: done checking for any_errors_fatal 30575 1726867575.96200: checking for max_fail_percentage 30575 1726867575.96202: done checking for max_fail_percentage 30575 1726867575.96203: checking to see if all hosts have failed and the running result is not ok 30575 1726867575.96204: done checking to see if all hosts have failed 
30575 1726867575.96205: getting the remaining hosts for this loop 30575 1726867575.96206: done getting the remaining hosts for this loop 30575 1726867575.96210: getting the next task for host managed_node3 30575 1726867575.96219: done getting next task for host managed_node3 30575 1726867575.96223: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867575.96228: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867575.96240: getting variables 30575 1726867575.96242: in VariableManager get_vars() 30575 1726867575.96604: Calling all_inventory to load vars for managed_node3 30575 1726867575.96607: Calling groups_inventory to load vars for managed_node3 30575 1726867575.96610: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867575.96620: Calling all_plugins_play to load vars for managed_node3 30575 1726867575.96623: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867575.96626: Calling groups_plugins_play to load vars for managed_node3 30575 1726867575.99846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867576.05048: done with get_vars() 30575 1726867576.05073: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:26:16 -0400 (0:00:01.213) 0:00:11.430 ****** 30575 1726867576.05275: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867576.05279: Creating lock for fedora.linux_system_roles.network_state 30575 1726867576.06396: worker is 1 (out of 1 available) 30575 1726867576.06408: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867576.06423: done queuing things up, now waiting for results queue to drain 30575 1726867576.06425: waiting for pending results... 
30575 1726867576.07200: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867576.07205: in run() - task 0affcac9-a3a5-e081-a588-00000000021d 30575 1726867576.07315: variable 'ansible_search_path' from source: unknown 30575 1726867576.07319: variable 'ansible_search_path' from source: unknown 30575 1726867576.07357: calling self._execute() 30575 1726867576.07472: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867576.07511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867576.07515: variable 'omit' from source: magic vars 30575 1726867576.08554: variable 'ansible_distribution_major_version' from source: facts 30575 1726867576.08569: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867576.08955: variable 'network_state' from source: role '' defaults 30575 1726867576.09135: Evaluated conditional (network_state != {}): False 30575 1726867576.09139: when evaluation is False, skipping this task 30575 1726867576.09141: _execute() done 30575 1726867576.09144: dumping result to json 30575 1726867576.09146: done dumping result, returning 30575 1726867576.09148: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-e081-a588-00000000021d] 30575 1726867576.09151: sending task result for task 0affcac9-a3a5-e081-a588-00000000021d skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867576.09289: no more pending results, returning what we have 30575 1726867576.09293: results queue empty 30575 1726867576.09294: checking for any_errors_fatal 30575 1726867576.09308: done checking for any_errors_fatal 30575 1726867576.09308: checking for max_fail_percentage 30575 1726867576.09311: done checking for max_fail_percentage 30575 1726867576.09311: 
checking to see if all hosts have failed and the running result is not ok 30575 1726867576.09312: done checking to see if all hosts have failed 30575 1726867576.09313: getting the remaining hosts for this loop 30575 1726867576.09315: done getting the remaining hosts for this loop 30575 1726867576.09319: getting the next task for host managed_node3 30575 1726867576.09327: done getting next task for host managed_node3 30575 1726867576.09332: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867576.09339: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867576.09354: getting variables 30575 1726867576.09355: in VariableManager get_vars() 30575 1726867576.09392: Calling all_inventory to load vars for managed_node3 30575 1726867576.09395: Calling groups_inventory to load vars for managed_node3 30575 1726867576.09398: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867576.09410: Calling all_plugins_play to load vars for managed_node3 30575 1726867576.09412: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867576.09415: Calling groups_plugins_play to load vars for managed_node3 30575 1726867576.10783: done sending task result for task 0affcac9-a3a5-e081-a588-00000000021d 30575 1726867576.10788: WORKER PROCESS EXITING 30575 1726867576.14246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867576.17412: done with get_vars() 30575 1726867576.17496: done getting variables 30575 1726867576.17675: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:26:16 -0400 (0:00:00.124) 0:00:11.554 ****** 30575 1726867576.17712: entering _queue_task() for managed_node3/debug 30575 1726867576.18357: worker is 1 (out of 1 available) 30575 1726867576.18372: exiting _queue_task() for managed_node3/debug 30575 1726867576.18533: done queuing things up, now waiting for results queue to drain 30575 1726867576.18536: waiting for pending results... 
30575 1726867576.19294: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867576.19866: in run() - task 0affcac9-a3a5-e081-a588-00000000021e 30575 1726867576.19870: variable 'ansible_search_path' from source: unknown 30575 1726867576.19873: variable 'ansible_search_path' from source: unknown 30575 1726867576.19876: calling self._execute() 30575 1726867576.19915: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867576.19985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867576.20096: variable 'omit' from source: magic vars 30575 1726867576.21022: variable 'ansible_distribution_major_version' from source: facts 30575 1726867576.21385: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867576.21388: variable 'omit' from source: magic vars 30575 1726867576.21390: variable 'omit' from source: magic vars 30575 1726867576.21682: variable 'omit' from source: magic vars 30575 1726867576.21686: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867576.21689: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867576.21691: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867576.22130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867576.22134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867576.22136: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867576.22139: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867576.22141: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30575 1726867576.22199: Set connection var ansible_pipelining to False 30575 1726867576.22563: Set connection var ansible_shell_type to sh 30575 1726867576.22566: Set connection var ansible_shell_executable to /bin/sh 30575 1726867576.22569: Set connection var ansible_timeout to 10 30575 1726867576.22571: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867576.22573: Set connection var ansible_connection to ssh 30575 1726867576.22575: variable 'ansible_shell_executable' from source: unknown 30575 1726867576.22579: variable 'ansible_connection' from source: unknown 30575 1726867576.22581: variable 'ansible_module_compression' from source: unknown 30575 1726867576.22584: variable 'ansible_shell_type' from source: unknown 30575 1726867576.22586: variable 'ansible_shell_executable' from source: unknown 30575 1726867576.22588: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867576.22590: variable 'ansible_pipelining' from source: unknown 30575 1726867576.22592: variable 'ansible_timeout' from source: unknown 30575 1726867576.22594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867576.22876: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867576.23482: variable 'omit' from source: magic vars 30575 1726867576.23485: starting attempt loop 30575 1726867576.23489: running the handler 30575 1726867576.23491: variable '__network_connections_result' from source: set_fact 30575 1726867576.23493: handler run complete 30575 1726867576.23825: attempt loop complete, returning result 30575 1726867576.23828: _execute() done 30575 1726867576.23834: dumping result to json 30575 1726867576.23836: 
done dumping result, returning 30575 1726867576.23839: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-e081-a588-00000000021e] 30575 1726867576.23841: sending task result for task 0affcac9-a3a5-e081-a588-00000000021e 30575 1726867576.23919: done sending task result for task 0affcac9-a3a5-e081-a588-00000000021e 30575 1726867576.23922: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f5796ae9-39ec-4c12-a218-e4d84e010b7f" ] } 30575 1726867576.23998: no more pending results, returning what we have 30575 1726867576.24002: results queue empty 30575 1726867576.24003: checking for any_errors_fatal 30575 1726867576.24012: done checking for any_errors_fatal 30575 1726867576.24013: checking for max_fail_percentage 30575 1726867576.24015: done checking for max_fail_percentage 30575 1726867576.24016: checking to see if all hosts have failed and the running result is not ok 30575 1726867576.24017: done checking to see if all hosts have failed 30575 1726867576.24018: getting the remaining hosts for this loop 30575 1726867576.24019: done getting the remaining hosts for this loop 30575 1726867576.24026: getting the next task for host managed_node3 30575 1726867576.24034: done getting next task for host managed_node3 30575 1726867576.24037: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867576.24153: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867576.24166: getting variables 30575 1726867576.24168: in VariableManager get_vars() 30575 1726867576.24206: Calling all_inventory to load vars for managed_node3 30575 1726867576.24208: Calling groups_inventory to load vars for managed_node3 30575 1726867576.24211: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867576.24221: Calling all_plugins_play to load vars for managed_node3 30575 1726867576.24226: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867576.24229: Calling groups_plugins_play to load vars for managed_node3 30575 1726867576.26905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867576.30209: done with get_vars() 30575 1726867576.30235: done getting variables 30575 1726867576.30412: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:26:16 -0400 (0:00:00.127) 0:00:11.682 ****** 30575 1726867576.30453: entering _queue_task() for managed_node3/debug 30575 1726867576.31031: worker is 1 (out of 1 available) 30575 1726867576.31480: exiting _queue_task() for managed_node3/debug 30575 1726867576.31491: done queuing things up, now waiting for results queue to drain 30575 1726867576.31492: waiting for pending results... 30575 1726867576.31731: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867576.32126: in run() - task 0affcac9-a3a5-e081-a588-00000000021f 30575 1726867576.32130: variable 'ansible_search_path' from source: unknown 30575 1726867576.32133: variable 'ansible_search_path' from source: unknown 30575 1726867576.32136: calling self._execute() 30575 1726867576.32230: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867576.32473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867576.32478: variable 'omit' from source: magic vars 30575 1726867576.33082: variable 'ansible_distribution_major_version' from source: facts 30575 1726867576.33285: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867576.33288: variable 'omit' from source: magic vars 30575 1726867576.33291: variable 'omit' from source: magic vars 30575 1726867576.33395: variable 'omit' from source: magic vars 30575 1726867576.33549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867576.33584: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867576.33602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867576.33619: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867576.33633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867576.33779: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867576.33783: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867576.33786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867576.34005: Set connection var ansible_pipelining to False 30575 1726867576.34008: Set connection var ansible_shell_type to sh 30575 1726867576.34011: Set connection var ansible_shell_executable to /bin/sh 30575 1726867576.34083: Set connection var ansible_timeout to 10 30575 1726867576.34086: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867576.34088: Set connection var ansible_connection to ssh 30575 1726867576.34090: variable 'ansible_shell_executable' from source: unknown 30575 1726867576.34093: variable 'ansible_connection' from source: unknown 30575 1726867576.34095: variable 'ansible_module_compression' from source: unknown 30575 1726867576.34097: variable 'ansible_shell_type' from source: unknown 30575 1726867576.34099: variable 'ansible_shell_executable' from source: unknown 30575 1726867576.34101: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867576.34103: variable 'ansible_pipelining' from source: unknown 30575 1726867576.34105: variable 'ansible_timeout' from source: unknown 30575 1726867576.34108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867576.34442: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867576.34451: variable 'omit' from source: magic vars 30575 1726867576.34457: starting attempt loop 30575 1726867576.34460: running the handler 30575 1726867576.34767: variable '__network_connections_result' from source: set_fact 30575 1726867576.34770: variable '__network_connections_result' from source: set_fact 30575 1726867576.34885: handler run complete 30575 1726867576.34910: attempt loop complete, returning result 30575 1726867576.34913: _execute() done 30575 1726867576.34915: dumping result to json 30575 1726867576.34918: done dumping result, returning 30575 1726867576.34929: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-e081-a588-00000000021f] 30575 1726867576.34934: sending task result for task 0affcac9-a3a5-e081-a588-00000000021f 30575 1726867576.35155: done sending task result for task 0affcac9-a3a5-e081-a588-00000000021f 30575 1726867576.35159: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f5796ae9-39ec-4c12-a218-e4d84e010b7f\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f5796ae9-39ec-4c12-a218-e4d84e010b7f" ] } } 30575 1726867576.35367: no more pending results, returning what we have 30575 1726867576.35371: results queue 
empty 30575 1726867576.35374: checking for any_errors_fatal 30575 1726867576.35382: done checking for any_errors_fatal 30575 1726867576.35383: checking for max_fail_percentage 30575 1726867576.35385: done checking for max_fail_percentage 30575 1726867576.35386: checking to see if all hosts have failed and the running result is not ok 30575 1726867576.35387: done checking to see if all hosts have failed 30575 1726867576.35388: getting the remaining hosts for this loop 30575 1726867576.35390: done getting the remaining hosts for this loop 30575 1726867576.35394: getting the next task for host managed_node3 30575 1726867576.35402: done getting next task for host managed_node3 30575 1726867576.35406: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867576.35411: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867576.35421: getting variables 30575 1726867576.35423: in VariableManager get_vars() 30575 1726867576.35465: Calling all_inventory to load vars for managed_node3 30575 1726867576.35468: Calling groups_inventory to load vars for managed_node3 30575 1726867576.35470: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867576.35717: Calling all_plugins_play to load vars for managed_node3 30575 1726867576.35722: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867576.35726: Calling groups_plugins_play to load vars for managed_node3 30575 1726867576.38548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867576.42466: done with get_vars() 30575 1726867576.42592: done getting variables 30575 1726867576.42652: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:26:16 -0400 (0:00:00.123) 0:00:11.805 ****** 30575 1726867576.42815: entering _queue_task() for managed_node3/debug 30575 1726867576.43341: worker is 1 (out of 1 available) 30575 1726867576.43356: exiting _queue_task() for managed_node3/debug 30575 1726867576.43368: done queuing things up, now waiting for results queue to drain 30575 1726867576.43370: waiting for pending results... 
30575 1726867576.43840: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867576.43845: in run() - task 0affcac9-a3a5-e081-a588-000000000220 30575 1726867576.43849: variable 'ansible_search_path' from source: unknown 30575 1726867576.43852: variable 'ansible_search_path' from source: unknown 30575 1726867576.43854: calling self._execute() 30575 1726867576.44151: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867576.44155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867576.44158: variable 'omit' from source: magic vars 30575 1726867576.44596: variable 'ansible_distribution_major_version' from source: facts 30575 1726867576.44606: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867576.44838: variable 'network_state' from source: role '' defaults 30575 1726867576.44849: Evaluated conditional (network_state != {}): False 30575 1726867576.44852: when evaluation is False, skipping this task 30575 1726867576.44855: _execute() done 30575 1726867576.44858: dumping result to json 30575 1726867576.44860: done dumping result, returning 30575 1726867576.44870: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-e081-a588-000000000220] 30575 1726867576.44876: sending task result for task 0affcac9-a3a5-e081-a588-000000000220 30575 1726867576.45081: done sending task result for task 0affcac9-a3a5-e081-a588-000000000220 30575 1726867576.45085: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30575 1726867576.45160: no more pending results, returning what we have 30575 1726867576.45164: results queue empty 30575 1726867576.45165: checking for any_errors_fatal 30575 1726867576.45176: done checking for any_errors_fatal 30575 1726867576.45179: checking for 
max_fail_percentage 30575 1726867576.45181: done checking for max_fail_percentage 30575 1726867576.45182: checking to see if all hosts have failed and the running result is not ok 30575 1726867576.45183: done checking to see if all hosts have failed 30575 1726867576.45183: getting the remaining hosts for this loop 30575 1726867576.45185: done getting the remaining hosts for this loop 30575 1726867576.45189: getting the next task for host managed_node3 30575 1726867576.45197: done getting next task for host managed_node3 30575 1726867576.45201: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867576.45206: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867576.45222: getting variables 30575 1726867576.45224: in VariableManager get_vars() 30575 1726867576.45258: Calling all_inventory to load vars for managed_node3 30575 1726867576.45261: Calling groups_inventory to load vars for managed_node3 30575 1726867576.45263: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867576.45275: Calling all_plugins_play to load vars for managed_node3 30575 1726867576.45602: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867576.45608: Calling groups_plugins_play to load vars for managed_node3 30575 1726867576.48258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867576.50315: done with get_vars() 30575 1726867576.50334: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:26:16 -0400 (0:00:00.076) 0:00:11.881 ****** 30575 1726867576.50430: entering _queue_task() for managed_node3/ping 30575 1726867576.50432: Creating lock for ping 30575 1726867576.50760: worker is 1 (out of 1 available) 30575 1726867576.50772: exiting _queue_task() for managed_node3/ping 30575 1726867576.50787: done queuing things up, now waiting for results queue to drain 30575 1726867576.50788: waiting for pending results... 
30575 1726867576.51360: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867576.51679: in run() - task 0affcac9-a3a5-e081-a588-000000000221 30575 1726867576.51684: variable 'ansible_search_path' from source: unknown 30575 1726867576.51686: variable 'ansible_search_path' from source: unknown 30575 1726867576.51690: calling self._execute() 30575 1726867576.51692: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867576.51984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867576.51987: variable 'omit' from source: magic vars 30575 1726867576.52799: variable 'ansible_distribution_major_version' from source: facts 30575 1726867576.52889: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867576.52901: variable 'omit' from source: magic vars 30575 1726867576.52967: variable 'omit' from source: magic vars 30575 1726867576.53183: variable 'omit' from source: magic vars 30575 1726867576.53186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867576.53227: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867576.53306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867576.53333: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867576.53640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867576.53643: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867576.53645: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867576.53647: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867576.53686: Set connection var ansible_pipelining to False 30575 1726867576.53693: Set connection var ansible_shell_type to sh 30575 1726867576.53702: Set connection var ansible_shell_executable to /bin/sh 30575 1726867576.53710: Set connection var ansible_timeout to 10 30575 1726867576.53718: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867576.53730: Set connection var ansible_connection to ssh 30575 1726867576.53967: variable 'ansible_shell_executable' from source: unknown 30575 1726867576.53970: variable 'ansible_connection' from source: unknown 30575 1726867576.53973: variable 'ansible_module_compression' from source: unknown 30575 1726867576.53974: variable 'ansible_shell_type' from source: unknown 30575 1726867576.53976: variable 'ansible_shell_executable' from source: unknown 30575 1726867576.53979: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867576.53981: variable 'ansible_pipelining' from source: unknown 30575 1726867576.53982: variable 'ansible_timeout' from source: unknown 30575 1726867576.53984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867576.54306: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867576.54326: variable 'omit' from source: magic vars 30575 1726867576.54338: starting attempt loop 30575 1726867576.54346: running the handler 30575 1726867576.54363: _low_level_execute_command(): starting 30575 1726867576.54376: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867576.56462: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867576.56662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867576.56700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867576.56839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867576.58465: stdout chunk (state=3): >>>/root <<< 30575 1726867576.58616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867576.58641: stdout chunk (state=3): >>><<< 30575 1726867576.58675: stderr chunk (state=3): >>><<< 30575 1726867576.58713: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867576.58838: _low_level_execute_command(): starting 30575 1726867576.58842: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867576.587259-31097-74912599467811 `" && echo ansible-tmp-1726867576.587259-31097-74912599467811="` echo /root/.ansible/tmp/ansible-tmp-1726867576.587259-31097-74912599467811 `" ) && sleep 0' 30575 1726867576.59489: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867576.59502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 <<< 30575 1726867576.59505: stderr chunk (state=3): >>>debug2: match found <<< 30575 1726867576.59519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867576.59602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867576.59686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867576.59739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867576.61592: stdout chunk (state=3): >>>ansible-tmp-1726867576.587259-31097-74912599467811=/root/.ansible/tmp/ansible-tmp-1726867576.587259-31097-74912599467811 <<< 30575 1726867576.61756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867576.61759: stdout chunk (state=3): >>><<< 30575 1726867576.61762: stderr chunk (state=3): >>><<< 30575 1726867576.61988: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867576.587259-31097-74912599467811=/root/.ansible/tmp/ansible-tmp-1726867576.587259-31097-74912599467811 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867576.61992: variable 'ansible_module_compression' from source: unknown 30575 1726867576.61994: ANSIBALLZ: Using lock for ping 30575 1726867576.61996: ANSIBALLZ: Acquiring lock 30575 1726867576.61998: ANSIBALLZ: Lock acquired: 140240644551312 30575 1726867576.62000: ANSIBALLZ: Creating module 30575 1726867576.85143: ANSIBALLZ: Writing module into payload 30575 1726867576.85201: ANSIBALLZ: Writing module 30575 1726867576.85226: ANSIBALLZ: Renaming module 30575 1726867576.85262: ANSIBALLZ: Done creating module 30575 1726867576.85268: variable 'ansible_facts' from source: unknown 30575 1726867576.85401: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867576.587259-31097-74912599467811/AnsiballZ_ping.py 30575 1726867576.85746: Sending initial data 30575 1726867576.85750: Sent initial data (151 bytes) 30575 1726867576.86700: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867576.86711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867576.86721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867576.86735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867576.86748: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867576.86755: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867576.86792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 
1726867576.86796: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867576.86902: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867576.86910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867576.86972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867576.88767: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867576.88771: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867576.88810: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpoorpods9 /root/.ansible/tmp/ansible-tmp-1726867576.587259-31097-74912599467811/AnsiballZ_ping.py <<< 30575 1726867576.88813: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867576.587259-31097-74912599467811/AnsiballZ_ping.py" <<< 30575 1726867576.88901: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpoorpods9" to remote "/root/.ansible/tmp/ansible-tmp-1726867576.587259-31097-74912599467811/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867576.587259-31097-74912599467811/AnsiballZ_ping.py" <<< 30575 1726867576.89672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867576.89705: stderr chunk (state=3): >>><<< 30575 1726867576.89717: stdout chunk (state=3): >>><<< 30575 1726867576.89775: done transferring module to remote 30575 1726867576.89795: _low_level_execute_command(): starting 30575 1726867576.89811: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867576.587259-31097-74912599467811/ /root/.ansible/tmp/ansible-tmp-1726867576.587259-31097-74912599467811/AnsiballZ_ping.py && sleep 0' 30575 1726867576.90259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867576.90265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867576.90305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867576.90308: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867576.90314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867576.90316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867576.90318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867576.90320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867576.90364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867576.90367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867576.90420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867576.92242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867576.92285: stderr chunk (state=3): >>><<< 30575 1726867576.92288: stdout chunk (state=3): >>><<< 30575 1726867576.92335: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867576.92339: _low_level_execute_command(): starting 30575 1726867576.92341: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867576.587259-31097-74912599467811/AnsiballZ_ping.py && sleep 0' 30575 1726867576.92854: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867576.92927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867576.92966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867576.92970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867576.93029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867577.07892: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30575 1726867577.09115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867577.09135: stderr chunk (state=3): >>><<< 30575 1726867577.09138: stdout chunk (state=3): >>><<< 30575 1726867577.09151: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 
10.31.15.68 closed. 30575 1726867577.09170: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867576.587259-31097-74912599467811/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867577.09180: _low_level_execute_command(): starting 30575 1726867577.09184: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867576.587259-31097-74912599467811/ > /dev/null 2>&1 && sleep 0' 30575 1726867577.09599: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867577.09602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867577.09605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867577.09607: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867577.09609: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867577.09648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867577.09673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867577.09707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867577.11516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867577.11538: stderr chunk (state=3): >>><<< 30575 1726867577.11542: stdout chunk (state=3): >>><<< 30575 1726867577.11553: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 30575 1726867577.11559: handler run complete 30575 1726867577.11570: attempt loop complete, returning result 30575 1726867577.11573: _execute() done 30575 1726867577.11575: dumping result to json 30575 1726867577.11579: done dumping result, returning 30575 1726867577.11588: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-e081-a588-000000000221] 30575 1726867577.11594: sending task result for task 0affcac9-a3a5-e081-a588-000000000221 30575 1726867577.11681: done sending task result for task 0affcac9-a3a5-e081-a588-000000000221 30575 1726867577.11684: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30575 1726867577.11763: no more pending results, returning what we have 30575 1726867577.11767: results queue empty 30575 1726867577.11768: checking for any_errors_fatal 30575 1726867577.11774: done checking for any_errors_fatal 30575 1726867577.11774: checking for max_fail_percentage 30575 1726867577.11776: done checking for max_fail_percentage 30575 1726867577.11779: checking to see if all hosts have failed and the running result is not ok 30575 1726867577.11780: done checking to see if all hosts have failed 30575 1726867577.11780: getting the remaining hosts for this loop 30575 1726867577.11782: done getting the remaining hosts for this loop 30575 1726867577.11785: getting the next task for host managed_node3 30575 1726867577.11794: done getting next task for host managed_node3 30575 1726867577.11797: ^ task is: TASK: meta (role_complete) 30575 1726867577.11801: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867577.11812: getting variables 30575 1726867577.11814: in VariableManager get_vars() 30575 1726867577.11852: Calling all_inventory to load vars for managed_node3 30575 1726867577.11855: Calling groups_inventory to load vars for managed_node3 30575 1726867577.11857: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867577.11866: Calling all_plugins_play to load vars for managed_node3 30575 1726867577.11869: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867577.11871: Calling groups_plugins_play to load vars for managed_node3 30575 1726867577.12694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867577.13562: done with get_vars() 30575 1726867577.13579: done getting variables 30575 1726867577.13640: done queuing things up, now waiting for results queue to drain 30575 1726867577.13642: results queue empty 30575 1726867577.13642: checking for any_errors_fatal 30575 1726867577.13644: done checking for any_errors_fatal 30575 1726867577.13644: checking for max_fail_percentage 30575 1726867577.13645: done checking for max_fail_percentage 30575 1726867577.13645: checking to see if all 
hosts have failed and the running result is not ok 30575 1726867577.13646: done checking to see if all hosts have failed 30575 1726867577.13646: getting the remaining hosts for this loop 30575 1726867577.13647: done getting the remaining hosts for this loop 30575 1726867577.13648: getting the next task for host managed_node3 30575 1726867577.13651: done getting next task for host managed_node3 30575 1726867577.13652: ^ task is: TASK: Show result 30575 1726867577.13654: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867577.13656: getting variables 30575 1726867577.13656: in VariableManager get_vars() 30575 1726867577.13662: Calling all_inventory to load vars for managed_node3 30575 1726867577.13664: Calling groups_inventory to load vars for managed_node3 30575 1726867577.13665: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867577.13669: Calling all_plugins_play to load vars for managed_node3 30575 1726867577.13670: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867577.13672: Calling groups_plugins_play to load vars for managed_node3 30575 1726867577.14379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867577.15232: done with get_vars() 30575 1726867577.15246: done getting variables 30575 1726867577.15276: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 17:26:17 -0400 (0:00:00.648) 0:00:12.530 ****** 30575 1726867577.15301: entering _queue_task() for managed_node3/debug 30575 1726867577.15554: worker is 1 (out of 1 available) 30575 1726867577.15567: exiting _queue_task() for managed_node3/debug 30575 1726867577.15580: done queuing things up, now waiting for results queue to drain 30575 1726867577.15582: waiting for pending results... 
30575 1726867577.15754: running TaskExecutor() for managed_node3/TASK: Show result 30575 1726867577.15828: in run() - task 0affcac9-a3a5-e081-a588-00000000018f 30575 1726867577.15841: variable 'ansible_search_path' from source: unknown 30575 1726867577.15844: variable 'ansible_search_path' from source: unknown 30575 1726867577.15872: calling self._execute() 30575 1726867577.15939: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867577.15942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867577.15952: variable 'omit' from source: magic vars 30575 1726867577.16211: variable 'ansible_distribution_major_version' from source: facts 30575 1726867577.16220: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867577.16228: variable 'omit' from source: magic vars 30575 1726867577.16263: variable 'omit' from source: magic vars 30575 1726867577.16286: variable 'omit' from source: magic vars 30575 1726867577.16315: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867577.16342: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867577.16359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867577.16373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867577.16384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867577.16407: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867577.16410: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867577.16412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867577.16483: Set 
connection var ansible_pipelining to False 30575 1726867577.16487: Set connection var ansible_shell_type to sh 30575 1726867577.16492: Set connection var ansible_shell_executable to /bin/sh 30575 1726867577.16497: Set connection var ansible_timeout to 10 30575 1726867577.16502: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867577.16508: Set connection var ansible_connection to ssh 30575 1726867577.16528: variable 'ansible_shell_executable' from source: unknown 30575 1726867577.16531: variable 'ansible_connection' from source: unknown 30575 1726867577.16534: variable 'ansible_module_compression' from source: unknown 30575 1726867577.16536: variable 'ansible_shell_type' from source: unknown 30575 1726867577.16538: variable 'ansible_shell_executable' from source: unknown 30575 1726867577.16540: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867577.16542: variable 'ansible_pipelining' from source: unknown 30575 1726867577.16544: variable 'ansible_timeout' from source: unknown 30575 1726867577.16546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867577.16642: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867577.16650: variable 'omit' from source: magic vars 30575 1726867577.16656: starting attempt loop 30575 1726867577.16659: running the handler 30575 1726867577.16700: variable '__network_connections_result' from source: set_fact 30575 1726867577.16753: variable '__network_connections_result' from source: set_fact 30575 1726867577.16832: handler run complete 30575 1726867577.16848: attempt loop complete, returning result 30575 1726867577.16851: _execute() done 30575 1726867577.16854: dumping result to json 30575 
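The "Set connection var" lines enumerate the effective connection settings for managed_node3. Expressed as host variables, they would look like the following; the values are copied from the log, the variable-file form is an assumption:

```yaml
# Host-variable equivalents of the "Set connection var" lines above
# (values as logged).
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false
ansible_module_compression: ZIP_DEFLATED
```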
1726867577.16858: done dumping result, returning 30575 1726867577.16865: done running TaskExecutor() for managed_node3/TASK: Show result [0affcac9-a3a5-e081-a588-00000000018f] 30575 1726867577.16870: sending task result for task 0affcac9-a3a5-e081-a588-00000000018f 30575 1726867577.16961: done sending task result for task 0affcac9-a3a5-e081-a588-00000000018f 30575 1726867577.16963: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "ip": {
                            "auto6": false,
                            "dhcp4": false
                        },
                        "name": "statebr",
                        "persistent_state": "present",
                        "type": "bridge"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f5796ae9-39ec-4c12-a218-e4d84e010b7f\n",
        "stderr_lines": [
            "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f5796ae9-39ec-4c12-a218-e4d84e010b7f"
        ]
    }
}
30575 1726867577.17062: no more pending results, returning what we have 30575 1726867577.17064: results queue empty 30575 1726867577.17065: checking for any_errors_fatal 30575 1726867577.17066: done checking for any_errors_fatal 30575 1726867577.17067: checking for max_fail_percentage 30575 1726867577.17068: done checking for max_fail_percentage 30575 1726867577.17069: checking to see if all hosts have failed and the running result is not ok 30575 1726867577.17070: done checking to see if all hosts have failed 30575 1726867577.17071: getting the remaining hosts for this loop 30575 1726867577.17073: done getting the remaining hosts for this loop 30575 1726867577.17076: getting the next task for host managed_node3 30575 1726867577.17084: done getting next task for host managed_node3 30575 1726867577.17086: ^ task is: TASK: Asserts 30575 1726867577.17089: ^ state is: HOST STATE:
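The `module_args` echoed in the "Show result" output pin down the `network_connections` input that produced this run. A hedged reconstruction of the role invocation; the role name and the connection spec are taken verbatim from the logged `module_args`, while the `include_role` mechanism is an assumption:

```yaml
# Reconstructed from the logged module_args; the include mechanism
# is assumed, the connection spec is as logged.
- name: Configure the bridge profile
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: statebr
        type: bridge
        persistent_state: present
        ip:
          dhcp4: false
          auto6: false
```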
block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867577.17093: getting variables 30575 1726867577.17094: in VariableManager get_vars() 30575 1726867577.17118: Calling all_inventory to load vars for managed_node3 30575 1726867577.17120: Calling groups_inventory to load vars for managed_node3 30575 1726867577.17125: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867577.17131: Calling all_plugins_play to load vars for managed_node3 30575 1726867577.17133: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867577.17135: Calling groups_plugins_play to load vars for managed_node3 30575 1726867577.17858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867577.18707: done with get_vars() 30575 1726867577.18720: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 17:26:17 -0400 (0:00:00.034) 0:00:12.565 ****** 30575 1726867577.18790: entering _queue_task() for managed_node3/include_tasks 30575 1726867577.18988: worker is 1 (out of 1 available) 30575 1726867577.19002: exiting _queue_task() for managed_node3/include_tasks 30575 1726867577.19013: done queuing things up, now waiting for results queue to drain 30575 1726867577.19015: waiting for pending 
results... 30575 1726867577.19180: running TaskExecutor() for managed_node3/TASK: Asserts 30575 1726867577.19247: in run() - task 0affcac9-a3a5-e081-a588-000000000096 30575 1726867577.19257: variable 'ansible_search_path' from source: unknown 30575 1726867577.19260: variable 'ansible_search_path' from source: unknown 30575 1726867577.19298: variable 'lsr_assert' from source: include params 30575 1726867577.19465: variable 'lsr_assert' from source: include params 30575 1726867577.19518: variable 'omit' from source: magic vars 30575 1726867577.19616: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867577.19624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867577.19635: variable 'omit' from source: magic vars 30575 1726867577.19798: variable 'ansible_distribution_major_version' from source: facts 30575 1726867577.19804: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867577.19812: variable 'item' from source: unknown 30575 1726867577.19858: variable 'item' from source: unknown 30575 1726867577.19880: variable 'item' from source: unknown 30575 1726867577.19928: variable 'item' from source: unknown 30575 1726867577.20044: dumping result to json 30575 1726867577.20046: done dumping result, returning 30575 1726867577.20048: done running TaskExecutor() for managed_node3/TASK: Asserts [0affcac9-a3a5-e081-a588-000000000096] 30575 1726867577.20050: sending task result for task 0affcac9-a3a5-e081-a588-000000000096 30575 1726867577.20088: done sending task result for task 0affcac9-a3a5-e081-a588-000000000096 30575 1726867577.20091: WORKER PROCESS EXITING 30575 1726867577.20111: no more pending results, returning what we have 30575 1726867577.20115: in VariableManager get_vars() 30575 1726867577.20143: Calling all_inventory to load vars for managed_node3 30575 1726867577.20146: Calling groups_inventory to load vars for managed_node3 30575 1726867577.20148: Calling 
all_plugins_inventory to load vars for managed_node3 30575 1726867577.20158: Calling all_plugins_play to load vars for managed_node3 30575 1726867577.20160: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867577.20163: Calling groups_plugins_play to load vars for managed_node3 30575 1726867577.21031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867577.22297: done with get_vars() 30575 1726867577.22309: variable 'ansible_search_path' from source: unknown 30575 1726867577.22310: variable 'ansible_search_path' from source: unknown 30575 1726867577.22336: we have included files to process 30575 1726867577.22337: generating all_blocks data 30575 1726867577.22338: done generating all_blocks data 30575 1726867577.22342: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30575 1726867577.22343: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30575 1726867577.22344: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30575 1726867577.22471: in VariableManager get_vars() 30575 1726867577.22485: done with get_vars() 30575 1726867577.22650: done processing included file 30575 1726867577.22652: iterating over new_blocks loaded from include file 30575 1726867577.22653: in VariableManager get_vars() 30575 1726867577.22661: done with get_vars() 30575 1726867577.22662: filtering new block on tags 30575 1726867577.22694: done filtering new block on tags 30575 1726867577.22696: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => 
(item=tasks/assert_profile_present.yml) 30575 1726867577.22699: extending task lists for all hosts with included blocks 30575 1726867577.23292: done extending task lists 30575 1726867577.23293: done processing included files 30575 1726867577.23293: results queue empty 30575 1726867577.23294: checking for any_errors_fatal 30575 1726867577.23297: done checking for any_errors_fatal 30575 1726867577.23297: checking for max_fail_percentage 30575 1726867577.23298: done checking for max_fail_percentage 30575 1726867577.23298: checking to see if all hosts have failed and the running result is not ok 30575 1726867577.23299: done checking to see if all hosts have failed 30575 1726867577.23299: getting the remaining hosts for this loop 30575 1726867577.23300: done getting the remaining hosts for this loop 30575 1726867577.23302: getting the next task for host managed_node3 30575 1726867577.23305: done getting next task for host managed_node3 30575 1726867577.23306: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30575 1726867577.23307: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
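The "Asserts" task iterates `lsr_assert` and includes each listed file; the log shows the single item `tasks/assert_profile_present.yml`. A sketch consistent with that; the `loop` construct is an assumption, the item value is from the log:

```yaml
# Sketch of the "Asserts" include; the item value is taken from the
# log, the loop construct is an assumption.
- name: Asserts
  ansible.builtin.include_tasks: "{{ item }}"
  loop: "{{ lsr_assert }}"  # e.g. ["tasks/assert_profile_present.yml"] per the log
```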
False 30575 1726867577.23309: getting variables 30575 1726867577.23310: in VariableManager get_vars() 30575 1726867577.23315: Calling all_inventory to load vars for managed_node3 30575 1726867577.23317: Calling groups_inventory to load vars for managed_node3 30575 1726867577.23318: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867577.23322: Calling all_plugins_play to load vars for managed_node3 30575 1726867577.23323: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867577.23326: Calling groups_plugins_play to load vars for managed_node3 30575 1726867577.24084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867577.25274: done with get_vars() 30575 1726867577.25290: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 17:26:17 -0400 (0:00:00.065) 0:00:12.630 ****** 30575 1726867577.25338: entering _queue_task() for managed_node3/include_tasks 30575 1726867577.25554: worker is 1 (out of 1 available) 30575 1726867577.25568: exiting _queue_task() for managed_node3/include_tasks 30575 1726867577.25583: done queuing things up, now waiting for results queue to drain 30575 1726867577.25584: waiting for pending results... 
30575 1726867577.25757: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 30575 1726867577.25836: in run() - task 0affcac9-a3a5-e081-a588-000000000383 30575 1726867577.25849: variable 'ansible_search_path' from source: unknown 30575 1726867577.25852: variable 'ansible_search_path' from source: unknown 30575 1726867577.25882: calling self._execute() 30575 1726867577.25946: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867577.25950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867577.25959: variable 'omit' from source: magic vars 30575 1726867577.26221: variable 'ansible_distribution_major_version' from source: facts 30575 1726867577.26233: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867577.26240: _execute() done 30575 1726867577.26243: dumping result to json 30575 1726867577.26246: done dumping result, returning 30575 1726867577.26251: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-e081-a588-000000000383] 30575 1726867577.26261: sending task result for task 0affcac9-a3a5-e081-a588-000000000383 30575 1726867577.26337: done sending task result for task 0affcac9-a3a5-e081-a588-000000000383 30575 1726867577.26339: WORKER PROCESS EXITING 30575 1726867577.26383: no more pending results, returning what we have 30575 1726867577.26388: in VariableManager get_vars() 30575 1726867577.26423: Calling all_inventory to load vars for managed_node3 30575 1726867577.26426: Calling groups_inventory to load vars for managed_node3 30575 1726867577.26429: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867577.26441: Calling all_plugins_play to load vars for managed_node3 30575 1726867577.26443: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867577.26446: Calling groups_plugins_play to load vars for managed_node3 30575 
1726867577.27729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867577.29376: done with get_vars() 30575 1726867577.29396: variable 'ansible_search_path' from source: unknown 30575 1726867577.29397: variable 'ansible_search_path' from source: unknown 30575 1726867577.29405: variable 'item' from source: include params 30575 1726867577.29530: variable 'item' from source: include params 30575 1726867577.29563: we have included files to process 30575 1726867577.29564: generating all_blocks data 30575 1726867577.29566: done generating all_blocks data 30575 1726867577.29567: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867577.29568: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867577.29570: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867577.30692: done processing included file 30575 1726867577.30694: iterating over new_blocks loaded from include file 30575 1726867577.30696: in VariableManager get_vars() 30575 1726867577.30710: done with get_vars() 30575 1726867577.30712: filtering new block on tags 30575 1726867577.30841: done filtering new block on tags 30575 1726867577.30844: in VariableManager get_vars() 30575 1726867577.30858: done with get_vars() 30575 1726867577.30860: filtering new block on tags 30575 1726867577.30915: done filtering new block on tags 30575 1726867577.30918: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 30575 1726867577.30923: extending task lists for all hosts with included blocks 30575 1726867577.31208: done 
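In turn, `assert_profile_present.yml:3` includes `get_profile_stat.yml`, as the task path in the banner above shows. A minimal sketch; the task name and included file are from the log, everything else is assumed:

```yaml
# Sketch of assert_profile_present.yml:3; file name and task name
# from the log, the bare include form is an assumption.
- name: Include the task 'get_profile_stat.yml'
  ansible.builtin.include_tasks: get_profile_stat.yml
```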
extending task lists 30575 1726867577.31209: done processing included files 30575 1726867577.31210: results queue empty 30575 1726867577.31211: checking for any_errors_fatal 30575 1726867577.31214: done checking for any_errors_fatal 30575 1726867577.31215: checking for max_fail_percentage 30575 1726867577.31216: done checking for max_fail_percentage 30575 1726867577.31216: checking to see if all hosts have failed and the running result is not ok 30575 1726867577.31217: done checking to see if all hosts have failed 30575 1726867577.31218: getting the remaining hosts for this loop 30575 1726867577.31219: done getting the remaining hosts for this loop 30575 1726867577.31222: getting the next task for host managed_node3 30575 1726867577.31227: done getting next task for host managed_node3 30575 1726867577.31229: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30575 1726867577.31232: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30575 1726867577.31234: getting variables 30575 1726867577.31235: in VariableManager get_vars() 30575 1726867577.31244: Calling all_inventory to load vars for managed_node3 30575 1726867577.31246: Calling groups_inventory to load vars for managed_node3 30575 1726867577.31248: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867577.31255: Calling all_plugins_play to load vars for managed_node3 30575 1726867577.31257: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867577.31259: Calling groups_plugins_play to load vars for managed_node3 30575 1726867577.36538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867577.38170: done with get_vars() 30575 1726867577.38191: done getting variables 30575 1726867577.38234: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:26:17 -0400 (0:00:00.129) 0:00:12.760 ****** 30575 1726867577.38262: entering _queue_task() for managed_node3/set_fact 30575 1726867577.38638: worker is 1 (out of 1 available) 30575 1726867577.38648: exiting _queue_task() for managed_node3/set_fact 30575 1726867577.38660: done queuing things up, now waiting for results queue to drain 30575 1726867577.38662: waiting for pending results... 
30575 1726867577.39013: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 30575 1726867577.39089: in run() - task 0affcac9-a3a5-e081-a588-0000000003fe 30575 1726867577.39127: variable 'ansible_search_path' from source: unknown 30575 1726867577.39136: variable 'ansible_search_path' from source: unknown 30575 1726867577.39175: calling self._execute() 30575 1726867577.39276: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867577.39291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867577.39326: variable 'omit' from source: magic vars 30575 1726867577.39709: variable 'ansible_distribution_major_version' from source: facts 30575 1726867577.39764: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867577.39768: variable 'omit' from source: magic vars 30575 1726867577.39817: variable 'omit' from source: magic vars 30575 1726867577.39856: variable 'omit' from source: magic vars 30575 1726867577.39975: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867577.39981: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867577.39983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867577.40000: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867577.40018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867577.40052: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867577.40059: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867577.40066: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867577.40180: Set connection var ansible_pipelining to False 30575 1726867577.40215: Set connection var ansible_shell_type to sh 30575 1726867577.40220: Set connection var ansible_shell_executable to /bin/sh 30575 1726867577.40222: Set connection var ansible_timeout to 10 30575 1726867577.40324: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867577.40328: Set connection var ansible_connection to ssh 30575 1726867577.40330: variable 'ansible_shell_executable' from source: unknown 30575 1726867577.40333: variable 'ansible_connection' from source: unknown 30575 1726867577.40335: variable 'ansible_module_compression' from source: unknown 30575 1726867577.40337: variable 'ansible_shell_type' from source: unknown 30575 1726867577.40340: variable 'ansible_shell_executable' from source: unknown 30575 1726867577.40341: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867577.40344: variable 'ansible_pipelining' from source: unknown 30575 1726867577.40346: variable 'ansible_timeout' from source: unknown 30575 1726867577.40348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867577.40467: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867577.40478: variable 'omit' from source: magic vars 30575 1726867577.40484: starting attempt loop 30575 1726867577.40487: running the handler 30575 1726867577.40497: handler run complete 30575 1726867577.40505: attempt loop complete, returning result 30575 1726867577.40507: _execute() done 30575 1726867577.40510: dumping result to json 30575 1726867577.40512: done dumping result, returning 30575 1726867577.40519: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-e081-a588-0000000003fe] 30575 1726867577.40527: sending task result for task 0affcac9-a3a5-e081-a588-0000000003fe 30575 1726867577.40611: done sending task result for task 0affcac9-a3a5-e081-a588-0000000003fe 30575 1726867577.40613: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30575 1726867577.40689: no more pending results, returning what we have 30575 1726867577.40692: results queue empty 30575 1726867577.40693: checking for any_errors_fatal 30575 1726867577.40694: done checking for any_errors_fatal 30575 1726867577.40694: checking for max_fail_percentage 30575 1726867577.40696: done checking for max_fail_percentage 30575 1726867577.40696: checking to see if all hosts have failed and the running result is not ok 30575 1726867577.40697: done checking to see if all hosts have failed 30575 1726867577.40698: getting the remaining hosts for this loop 30575 1726867577.40699: done getting the remaining hosts for this loop 30575 1726867577.40703: getting the next task for host managed_node3 30575 1726867577.40710: done getting next task for host managed_node3 30575 1726867577.40714: ^ task is: TASK: Stat profile file 30575 1726867577.40718: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867577.40722: getting variables 30575 1726867577.40724: in VariableManager get_vars() 30575 1726867577.40754: Calling all_inventory to load vars for managed_node3 30575 1726867577.40757: Calling groups_inventory to load vars for managed_node3 30575 1726867577.40760: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867577.40769: Calling all_plugins_play to load vars for managed_node3 30575 1726867577.40772: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867577.40774: Calling groups_plugins_play to load vars for managed_node3 30575 1726867577.41515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867577.42582: done with get_vars() 30575 1726867577.42599: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:26:17 -0400 (0:00:00.044) 0:00:12.804 ****** 30575 1726867577.42686: entering _queue_task() for managed_node3/stat 30575 1726867577.42932: worker is 1 (out of 1 available) 30575 1726867577.42947: exiting _queue_task() for managed_node3/stat 30575 1726867577.42960: done queuing things up, now waiting for results queue to drain 30575 1726867577.42961: 
waiting for pending results... 30575 1726867577.43303: running TaskExecutor() for managed_node3/TASK: Stat profile file 30575 1726867577.43340: in run() - task 0affcac9-a3a5-e081-a588-0000000003ff 30575 1726867577.43360: variable 'ansible_search_path' from source: unknown 30575 1726867577.43368: variable 'ansible_search_path' from source: unknown 30575 1726867577.43412: calling self._execute() 30575 1726867577.43500: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867577.43517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867577.43582: variable 'omit' from source: magic vars 30575 1726867577.43896: variable 'ansible_distribution_major_version' from source: facts 30575 1726867577.43921: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867577.43928: variable 'omit' from source: magic vars 30575 1726867577.43961: variable 'omit' from source: magic vars 30575 1726867577.44029: variable 'profile' from source: play vars 30575 1726867577.44038: variable 'interface' from source: play vars 30575 1726867577.44092: variable 'interface' from source: play vars 30575 1726867577.44106: variable 'omit' from source: magic vars 30575 1726867577.44137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867577.44166: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867577.44181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867577.44195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867577.44206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867577.44229: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30575 1726867577.44232: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867577.44234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867577.44308: Set connection var ansible_pipelining to False 30575 1726867577.44311: Set connection var ansible_shell_type to sh 30575 1726867577.44316: Set connection var ansible_shell_executable to /bin/sh 30575 1726867577.44321: Set connection var ansible_timeout to 10 30575 1726867577.44327: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867577.44333: Set connection var ansible_connection to ssh 30575 1726867577.44350: variable 'ansible_shell_executable' from source: unknown 30575 1726867577.44353: variable 'ansible_connection' from source: unknown 30575 1726867577.44355: variable 'ansible_module_compression' from source: unknown 30575 1726867577.44357: variable 'ansible_shell_type' from source: unknown 30575 1726867577.44359: variable 'ansible_shell_executable' from source: unknown 30575 1726867577.44362: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867577.44366: variable 'ansible_pipelining' from source: unknown 30575 1726867577.44369: variable 'ansible_timeout' from source: unknown 30575 1726867577.44371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867577.44512: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867577.44521: variable 'omit' from source: magic vars 30575 1726867577.44527: starting attempt loop 30575 1726867577.44530: running the handler 30575 1726867577.44541: _low_level_execute_command(): starting 30575 1726867577.44549: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 
1726867577.45043: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867577.45047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867577.45050: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867577.45053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867577.45099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867577.45106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867577.45108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867577.45159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867577.46834: stdout chunk (state=3): >>>/root <<< 30575 1726867577.46950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867577.46953: stdout chunk (state=3): >>><<< 30575 1726867577.46961: stderr chunk (state=3): >>><<< 30575 1726867577.46984: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867577.46995: _low_level_execute_command(): starting 30575 1726867577.46998: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867577.4697912-31140-7173459983602 `" && echo ansible-tmp-1726867577.4697912-31140-7173459983602="` echo /root/.ansible/tmp/ansible-tmp-1726867577.4697912-31140-7173459983602 `" ) && sleep 0' 30575 1726867577.47411: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867577.47415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867577.47424: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867577.47427: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867577.47429: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867577.47473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867577.47476: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867577.47526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867577.49426: stdout chunk (state=3): >>>ansible-tmp-1726867577.4697912-31140-7173459983602=/root/.ansible/tmp/ansible-tmp-1726867577.4697912-31140-7173459983602 <<< 30575 1726867577.49533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867577.49552: stderr chunk (state=3): >>><<< 30575 1726867577.49556: stdout chunk (state=3): >>><<< 30575 1726867577.49569: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867577.4697912-31140-7173459983602=/root/.ansible/tmp/ansible-tmp-1726867577.4697912-31140-7173459983602 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867577.49606: variable 'ansible_module_compression' from source: unknown 30575 1726867577.49651: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30575 1726867577.49685: variable 'ansible_facts' from source: unknown 30575 1726867577.49739: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867577.4697912-31140-7173459983602/AnsiballZ_stat.py 30575 1726867577.49829: Sending initial data 30575 1726867577.49833: Sent initial data (151 bytes) 30575 1726867577.50239: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867577.50243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867577.50246: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867577.50250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867577.50297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867577.50301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867577.50350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867577.51889: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867577.51936: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867577.51983: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpr6x81nir /root/.ansible/tmp/ansible-tmp-1726867577.4697912-31140-7173459983602/AnsiballZ_stat.py <<< 30575 1726867577.51987: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867577.4697912-31140-7173459983602/AnsiballZ_stat.py" <<< 30575 1726867577.52021: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpr6x81nir" to remote "/root/.ansible/tmp/ansible-tmp-1726867577.4697912-31140-7173459983602/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867577.4697912-31140-7173459983602/AnsiballZ_stat.py" <<< 30575 1726867577.52555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867577.52587: stderr chunk (state=3): >>><<< 30575 1726867577.52590: stdout chunk (state=3): >>><<< 30575 1726867577.52621: done transferring module to remote 30575 1726867577.52632: _low_level_execute_command(): starting 30575 1726867577.52639: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867577.4697912-31140-7173459983602/ /root/.ansible/tmp/ansible-tmp-1726867577.4697912-31140-7173459983602/AnsiballZ_stat.py && sleep 0' 30575 1726867577.53036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867577.53039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867577.53042: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867577.53044: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867577.53050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867577.53094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867577.53098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867577.53146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867577.54889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867577.54910: stderr chunk (state=3): >>><<< 30575 1726867577.54913: stdout chunk (state=3): >>><<< 30575 1726867577.54926: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867577.54930: _low_level_execute_command(): starting 30575 1726867577.54932: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867577.4697912-31140-7173459983602/AnsiballZ_stat.py && sleep 0' 30575 1726867577.55327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867577.55331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867577.55333: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867577.55335: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867577.55337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867577.55385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 
setting O_NONBLOCK <<< 30575 1726867577.55392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867577.55439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867577.70644: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30575 1726867577.71894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867577.71914: stderr chunk (state=3): >>><<< 30575 1726867577.71917: stdout chunk (state=3): >>><<< 30575 1726867577.71933: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867577.72007: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867577.4697912-31140-7173459983602/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867577.72011: _low_level_execute_command(): starting 30575 1726867577.72013: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867577.4697912-31140-7173459983602/ > /dev/null 2>&1 && sleep 0' 30575 1726867577.72583: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867577.72586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867577.72589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867577.72591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867577.72598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 <<< 30575 1726867577.72606: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867577.72629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867577.72632: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867577.72635: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867577.72642: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867577.72686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867577.72689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867577.72692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867577.72694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867577.72697: stderr chunk (state=3): >>>debug2: match found <<< 30575 1726867577.72700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867577.72761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867577.72772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867577.72814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867577.72861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867577.74742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867577.74983: stderr chunk (state=3): >>><<< 30575 1726867577.74987: stdout chunk (state=3): >>><<< 30575 1726867577.74990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867577.74993: handler run complete 30575 1726867577.74996: attempt loop complete, returning result 30575 1726867577.74998: _execute() done 30575 1726867577.75000: dumping result to json 30575 1726867577.75003: done dumping result, returning 30575 1726867577.75005: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcac9-a3a5-e081-a588-0000000003ff] 30575 1726867577.75008: sending task result for task 0affcac9-a3a5-e081-a588-0000000003ff 30575 1726867577.75073: done sending task result for task 0affcac9-a3a5-e081-a588-0000000003ff 30575 1726867577.75076: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30575 1726867577.75147: no more pending results, returning what we have 30575 1726867577.75151: results queue empty 30575 1726867577.75152: checking for any_errors_fatal 30575 
1726867577.75157: done checking for any_errors_fatal 30575 1726867577.75158: checking for max_fail_percentage 30575 1726867577.75160: done checking for max_fail_percentage 30575 1726867577.75161: checking to see if all hosts have failed and the running result is not ok 30575 1726867577.75162: done checking to see if all hosts have failed 30575 1726867577.75163: getting the remaining hosts for this loop 30575 1726867577.75164: done getting the remaining hosts for this loop 30575 1726867577.75169: getting the next task for host managed_node3 30575 1726867577.75181: done getting next task for host managed_node3 30575 1726867577.75184: ^ task is: TASK: Set NM profile exist flag based on the profile files 30575 1726867577.75189: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867577.75193: getting variables 30575 1726867577.75195: in VariableManager get_vars() 30575 1726867577.75232: Calling all_inventory to load vars for managed_node3 30575 1726867577.75235: Calling groups_inventory to load vars for managed_node3 30575 1726867577.75238: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867577.75250: Calling all_plugins_play to load vars for managed_node3 30575 1726867577.75253: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867577.75256: Calling groups_plugins_play to load vars for managed_node3 30575 1726867577.77085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867577.80358: done with get_vars() 30575 1726867577.80432: done getting variables 30575 1726867577.80496: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:26:17 -0400 (0:00:00.378) 0:00:13.183 ****** 30575 1726867577.80581: entering _queue_task() for managed_node3/set_fact 30575 1726867577.81237: worker is 1 (out of 1 available) 30575 1726867577.81249: exiting _queue_task() for managed_node3/set_fact 30575 1726867577.81334: done queuing things up, now waiting for results queue to drain 30575 1726867577.81337: waiting for pending results... 
30575 1726867577.81996: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 30575 1726867577.82007: in run() - task 0affcac9-a3a5-e081-a588-000000000400 30575 1726867577.82012: variable 'ansible_search_path' from source: unknown 30575 1726867577.82015: variable 'ansible_search_path' from source: unknown 30575 1726867577.82220: calling self._execute() 30575 1726867577.82318: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867577.82554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867577.82559: variable 'omit' from source: magic vars 30575 1726867577.83260: variable 'ansible_distribution_major_version' from source: facts 30575 1726867577.83582: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867577.83626: variable 'profile_stat' from source: set_fact 30575 1726867577.83644: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867577.83653: when evaluation is False, skipping this task 30575 1726867577.83663: _execute() done 30575 1726867577.83673: dumping result to json 30575 1726867577.83984: done dumping result, returning 30575 1726867577.83988: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-e081-a588-000000000400] 30575 1726867577.83992: sending task result for task 0affcac9-a3a5-e081-a588-000000000400 30575 1726867577.84065: done sending task result for task 0affcac9-a3a5-e081-a588-000000000400 30575 1726867577.84070: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867577.84126: no more pending results, returning what we have 30575 1726867577.84130: results queue empty 30575 1726867577.84131: checking for any_errors_fatal 30575 1726867577.84140: done checking for any_errors_fatal 30575 1726867577.84141: 
checking for max_fail_percentage 30575 1726867577.84143: done checking for max_fail_percentage 30575 1726867577.84143: checking to see if all hosts have failed and the running result is not ok 30575 1726867577.84144: done checking to see if all hosts have failed 30575 1726867577.84145: getting the remaining hosts for this loop 30575 1726867577.84146: done getting the remaining hosts for this loop 30575 1726867577.84150: getting the next task for host managed_node3 30575 1726867577.84158: done getting next task for host managed_node3 30575 1726867577.84160: ^ task is: TASK: Get NM profile info 30575 1726867577.84165: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867577.84169: getting variables 30575 1726867577.84170: in VariableManager get_vars() 30575 1726867577.84204: Calling all_inventory to load vars for managed_node3 30575 1726867577.84207: Calling groups_inventory to load vars for managed_node3 30575 1726867577.84332: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867577.84345: Calling all_plugins_play to load vars for managed_node3 30575 1726867577.84348: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867577.84352: Calling groups_plugins_play to load vars for managed_node3 30575 1726867577.86216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867577.88097: done with get_vars() 30575 1726867577.88117: done getting variables 30575 1726867577.88227: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:26:17 -0400 (0:00:00.076) 0:00:13.260 ****** 30575 1726867577.88268: entering _queue_task() for managed_node3/shell 30575 1726867577.88270: Creating lock for shell 30575 1726867577.88648: worker is 1 (out of 1 available) 30575 1726867577.88660: exiting _queue_task() for managed_node3/shell 30575 1726867577.88671: done queuing things up, now waiting for results queue to drain 30575 1726867577.88672: waiting for pending results... 
30575 1726867577.89003: running TaskExecutor() for managed_node3/TASK: Get NM profile info 30575 1726867577.89137: in run() - task 0affcac9-a3a5-e081-a588-000000000401 30575 1726867577.89158: variable 'ansible_search_path' from source: unknown 30575 1726867577.89165: variable 'ansible_search_path' from source: unknown 30575 1726867577.89208: calling self._execute() 30575 1726867577.89298: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867577.89310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867577.89326: variable 'omit' from source: magic vars 30575 1726867577.89683: variable 'ansible_distribution_major_version' from source: facts 30575 1726867577.89701: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867577.89714: variable 'omit' from source: magic vars 30575 1726867577.89770: variable 'omit' from source: magic vars 30575 1726867577.89854: variable 'profile' from source: play vars 30575 1726867577.89859: variable 'interface' from source: play vars 30575 1726867577.89911: variable 'interface' from source: play vars 30575 1726867577.89929: variable 'omit' from source: magic vars 30575 1726867577.89957: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867577.89987: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867577.90002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867577.90015: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867577.90028: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867577.90050: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 
1726867577.90053: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867577.90055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867577.90136: Set connection var ansible_pipelining to False 30575 1726867577.90142: Set connection var ansible_shell_type to sh 30575 1726867577.90149: Set connection var ansible_shell_executable to /bin/sh 30575 1726867577.90154: Set connection var ansible_timeout to 10 30575 1726867577.90159: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867577.90165: Set connection var ansible_connection to ssh 30575 1726867577.90187: variable 'ansible_shell_executable' from source: unknown 30575 1726867577.90190: variable 'ansible_connection' from source: unknown 30575 1726867577.90193: variable 'ansible_module_compression' from source: unknown 30575 1726867577.90195: variable 'ansible_shell_type' from source: unknown 30575 1726867577.90197: variable 'ansible_shell_executable' from source: unknown 30575 1726867577.90199: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867577.90202: variable 'ansible_pipelining' from source: unknown 30575 1726867577.90205: variable 'ansible_timeout' from source: unknown 30575 1726867577.90208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867577.90305: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867577.90314: variable 'omit' from source: magic vars 30575 1726867577.90329: starting attempt loop 30575 1726867577.90333: running the handler 30575 1726867577.90336: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867577.90348: _low_level_execute_command(): starting 30575 1726867577.90355: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867577.90844: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867577.90848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867577.90851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867577.90896: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867577.90909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867577.90963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867577.92666: stdout chunk (state=3): >>>/root <<< 30575 1726867577.92749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867577.92786: stderr chunk (state=3): 
>>><<< 30575 1726867577.92790: stdout chunk (state=3): >>><<< 30575 1726867577.92813: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867577.92826: _low_level_execute_command(): starting 30575 1726867577.92835: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867577.928126-31162-216062449170378 `" && echo ansible-tmp-1726867577.928126-31162-216062449170378="` echo /root/.ansible/tmp/ansible-tmp-1726867577.928126-31162-216062449170378 `" ) && sleep 0' 30575 1726867577.93585: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867577.93597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867577.93599: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867577.93602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867577.93605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867577.93607: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867577.93611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867577.93613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867577.93687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867577.95554: stdout chunk (state=3): >>>ansible-tmp-1726867577.928126-31162-216062449170378=/root/.ansible/tmp/ansible-tmp-1726867577.928126-31162-216062449170378 <<< 30575 1726867577.95666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867577.95690: stderr chunk (state=3): >>><<< 30575 1726867577.95694: stdout chunk (state=3): >>><<< 30575 1726867577.95708: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867577.928126-31162-216062449170378=/root/.ansible/tmp/ansible-tmp-1726867577.928126-31162-216062449170378 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867577.95733: variable 'ansible_module_compression' from source: unknown 30575 1726867577.95775: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867577.95806: variable 'ansible_facts' from source: unknown 30575 1726867577.95853: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867577.928126-31162-216062449170378/AnsiballZ_command.py 30575 1726867577.95951: Sending initial data 30575 1726867577.95954: Sent initial data (155 bytes) 30575 1726867577.96596: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867577.96635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867577.96652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867577.96665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867577.96756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867577.98276: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867577.98286: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867577.98321: stderr chunk 
(state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867577.98366: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmptl1vc_jx /root/.ansible/tmp/ansible-tmp-1726867577.928126-31162-216062449170378/AnsiballZ_command.py <<< 30575 1726867577.98374: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867577.928126-31162-216062449170378/AnsiballZ_command.py" <<< 30575 1726867577.98411: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmptl1vc_jx" to remote "/root/.ansible/tmp/ansible-tmp-1726867577.928126-31162-216062449170378/AnsiballZ_command.py" <<< 30575 1726867577.98414: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867577.928126-31162-216062449170378/AnsiballZ_command.py" <<< 30575 1726867577.98944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867577.98975: stderr chunk (state=3): >>><<< 30575 1726867577.98981: stdout chunk (state=3): >>><<< 30575 1726867577.99006: done transferring module to remote 30575 1726867577.99014: _low_level_execute_command(): starting 30575 1726867577.99019: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867577.928126-31162-216062449170378/ /root/.ansible/tmp/ansible-tmp-1726867577.928126-31162-216062449170378/AnsiballZ_command.py && sleep 0' 30575 1726867577.99444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867577.99448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867577.99453: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867577.99456: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867577.99458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867577.99495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867577.99498: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867577.99549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867578.01258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867578.01286: stderr chunk (state=3): >>><<< 30575 1726867578.01290: stdout chunk (state=3): >>><<< 30575 1726867578.01302: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867578.01305: _low_level_execute_command(): starting 30575 1726867578.01309: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867577.928126-31162-216062449170378/AnsiballZ_command.py && sleep 0' 30575 1726867578.01735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867578.01738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867578.01740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867578.01742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867578.01789: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867578.01795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867578.01797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867578.01845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867578.18589: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 17:26:18.166932", "end": "2024-09-20 17:26:18.183629", "delta": "0:00:00.016697", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867578.20050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867578.20082: stderr chunk (state=3): >>><<< 30575 1726867578.20085: stdout chunk (state=3): >>><<< 30575 1726867578.20104: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 17:26:18.166932", "end": "2024-09-20 17:26:18.183629", "delta": "0:00:00.016697", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.15.68 closed. 30575 1726867578.20134: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867577.928126-31162-216062449170378/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867578.20145: _low_level_execute_command(): starting 30575 1726867578.20153: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867577.928126-31162-216062449170378/ > /dev/null 2>&1 && sleep 0' 30575 1726867578.20617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867578.20620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867578.20622: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867578.20625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867578.20627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867578.20682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867578.20686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867578.20690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867578.20735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867578.22515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867578.22545: stderr chunk (state=3): >>><<< 30575 1726867578.22548: stdout chunk (state=3): >>><<< 30575 1726867578.22559: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867578.22566: handler run complete 30575 1726867578.22588: Evaluated conditional (False): False 30575 1726867578.22596: attempt loop complete, returning result 30575 1726867578.22599: _execute() done 30575 1726867578.22601: dumping result to json 30575 1726867578.22606: done dumping result, returning 30575 1726867578.22613: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcac9-a3a5-e081-a588-000000000401] 30575 1726867578.22617: sending task result for task 0affcac9-a3a5-e081-a588-000000000401 30575 1726867578.22715: done sending task result for task 0affcac9-a3a5-e081-a588-000000000401 30575 1726867578.22717: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.016697", "end": "2024-09-20 17:26:18.183629", "rc": 0, "start": "2024-09-20 17:26:18.166932" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 30575 1726867578.22785: no more pending results, returning what we have 30575 1726867578.22788: results queue empty 30575 1726867578.22789: checking for any_errors_fatal 30575 1726867578.22796: done checking for any_errors_fatal 30575 1726867578.22797: checking for max_fail_percentage 30575 1726867578.22798: done checking for max_fail_percentage 30575 1726867578.22799: checking to see if all hosts have failed and the running result is not ok 30575 1726867578.22800: done checking to see if all hosts have failed 30575 1726867578.22801: getting the remaining hosts for this loop 30575 1726867578.22802: done getting the remaining hosts for this loop 30575 1726867578.22805: getting the next task for host managed_node3 30575 1726867578.22813: done getting next task for host 
managed_node3 30575 1726867578.22815: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30575 1726867578.22820: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867578.22824: getting variables 30575 1726867578.22825: in VariableManager get_vars() 30575 1726867578.22858: Calling all_inventory to load vars for managed_node3 30575 1726867578.22860: Calling groups_inventory to load vars for managed_node3 30575 1726867578.22863: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867578.22875: Calling all_plugins_play to load vars for managed_node3 30575 1726867578.22886: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867578.22890: Calling groups_plugins_play to load vars for managed_node3 30575 1726867578.23699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867578.24572: done with get_vars() 30575 1726867578.24590: done getting variables 30575 1726867578.24639: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:26:18 -0400 (0:00:00.364) 0:00:13.624 ****** 30575 1726867578.24663: entering _queue_task() for managed_node3/set_fact 30575 1726867578.24890: worker is 1 (out of 1 available) 30575 1726867578.24902: exiting _queue_task() for managed_node3/set_fact 30575 1726867578.24914: done queuing things up, now waiting for results queue to drain 30575 1726867578.24916: waiting for pending results... 
30575 1726867578.25093: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30575 1726867578.25171: in run() - task 0affcac9-a3a5-e081-a588-000000000402 30575 1726867578.25185: variable 'ansible_search_path' from source: unknown 30575 1726867578.25188: variable 'ansible_search_path' from source: unknown 30575 1726867578.25217: calling self._execute() 30575 1726867578.25290: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.25294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.25303: variable 'omit' from source: magic vars 30575 1726867578.25579: variable 'ansible_distribution_major_version' from source: facts 30575 1726867578.25592: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867578.25680: variable 'nm_profile_exists' from source: set_fact 30575 1726867578.25692: Evaluated conditional (nm_profile_exists.rc == 0): True 30575 1726867578.25700: variable 'omit' from source: magic vars 30575 1726867578.25740: variable 'omit' from source: magic vars 30575 1726867578.25762: variable 'omit' from source: magic vars 30575 1726867578.25796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867578.25827: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867578.25840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867578.25853: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867578.25864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867578.25889: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
30575 1726867578.25891: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.25895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.25967: Set connection var ansible_pipelining to False 30575 1726867578.25970: Set connection var ansible_shell_type to sh 30575 1726867578.25975: Set connection var ansible_shell_executable to /bin/sh 30575 1726867578.25982: Set connection var ansible_timeout to 10 30575 1726867578.25987: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867578.25993: Set connection var ansible_connection to ssh 30575 1726867578.26011: variable 'ansible_shell_executable' from source: unknown 30575 1726867578.26014: variable 'ansible_connection' from source: unknown 30575 1726867578.26016: variable 'ansible_module_compression' from source: unknown 30575 1726867578.26018: variable 'ansible_shell_type' from source: unknown 30575 1726867578.26022: variable 'ansible_shell_executable' from source: unknown 30575 1726867578.26028: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.26031: variable 'ansible_pipelining' from source: unknown 30575 1726867578.26033: variable 'ansible_timeout' from source: unknown 30575 1726867578.26035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.26135: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867578.26153: variable 'omit' from source: magic vars 30575 1726867578.26156: starting attempt loop 30575 1726867578.26159: running the handler 30575 1726867578.26165: handler run complete 30575 1726867578.26175: attempt loop complete, returning result 30575 1726867578.26179: _execute() done 
30575 1726867578.26182: dumping result to json 30575 1726867578.26184: done dumping result, returning 30575 1726867578.26192: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-e081-a588-000000000402] 30575 1726867578.26197: sending task result for task 0affcac9-a3a5-e081-a588-000000000402 30575 1726867578.26284: done sending task result for task 0affcac9-a3a5-e081-a588-000000000402 30575 1726867578.26286: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 30575 1726867578.26340: no more pending results, returning what we have 30575 1726867578.26343: results queue empty 30575 1726867578.26344: checking for any_errors_fatal 30575 1726867578.26353: done checking for any_errors_fatal 30575 1726867578.26354: checking for max_fail_percentage 30575 1726867578.26356: done checking for max_fail_percentage 30575 1726867578.26356: checking to see if all hosts have failed and the running result is not ok 30575 1726867578.26357: done checking to see if all hosts have failed 30575 1726867578.26358: getting the remaining hosts for this loop 30575 1726867578.26359: done getting the remaining hosts for this loop 30575 1726867578.26363: getting the next task for host managed_node3 30575 1726867578.26374: done getting next task for host managed_node3 30575 1726867578.26376: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30575 1726867578.26383: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867578.26387: getting variables 30575 1726867578.26388: in VariableManager get_vars() 30575 1726867578.26419: Calling all_inventory to load vars for managed_node3 30575 1726867578.26421: Calling groups_inventory to load vars for managed_node3 30575 1726867578.26427: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867578.26437: Calling all_plugins_play to load vars for managed_node3 30575 1726867578.26439: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867578.26442: Calling groups_plugins_play to load vars for managed_node3 30575 1726867578.27383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867578.28238: done with get_vars() 30575 1726867578.28252: done getting variables 30575 1726867578.28296: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867578.28387: variable 'profile' from source: play vars 30575 
1726867578.28391: variable 'interface' from source: play vars 30575 1726867578.28435: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:26:18 -0400 (0:00:00.037) 0:00:13.662 ****** 30575 1726867578.28460: entering _queue_task() for managed_node3/command 30575 1726867578.28696: worker is 1 (out of 1 available) 30575 1726867578.28710: exiting _queue_task() for managed_node3/command 30575 1726867578.28723: done queuing things up, now waiting for results queue to drain 30575 1726867578.28724: waiting for pending results... 30575 1726867578.28899: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr 30575 1726867578.28989: in run() - task 0affcac9-a3a5-e081-a588-000000000404 30575 1726867578.28999: variable 'ansible_search_path' from source: unknown 30575 1726867578.29002: variable 'ansible_search_path' from source: unknown 30575 1726867578.29034: calling self._execute() 30575 1726867578.29100: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.29104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.29112: variable 'omit' from source: magic vars 30575 1726867578.29368: variable 'ansible_distribution_major_version' from source: facts 30575 1726867578.29379: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867578.29463: variable 'profile_stat' from source: set_fact 30575 1726867578.29473: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867578.29476: when evaluation is False, skipping this task 30575 1726867578.29480: _execute() done 30575 1726867578.29483: dumping result to json 30575 1726867578.29488: done dumping result, returning 30575 1726867578.29499: done running 
TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-000000000404] 30575 1726867578.29502: sending task result for task 0affcac9-a3a5-e081-a588-000000000404 30575 1726867578.29579: done sending task result for task 0affcac9-a3a5-e081-a588-000000000404 30575 1726867578.29582: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867578.29647: no more pending results, returning what we have 30575 1726867578.29651: results queue empty 30575 1726867578.29652: checking for any_errors_fatal 30575 1726867578.29658: done checking for any_errors_fatal 30575 1726867578.29659: checking for max_fail_percentage 30575 1726867578.29660: done checking for max_fail_percentage 30575 1726867578.29661: checking to see if all hosts have failed and the running result is not ok 30575 1726867578.29662: done checking to see if all hosts have failed 30575 1726867578.29663: getting the remaining hosts for this loop 30575 1726867578.29664: done getting the remaining hosts for this loop 30575 1726867578.29667: getting the next task for host managed_node3 30575 1726867578.29675: done getting next task for host managed_node3 30575 1726867578.29680: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30575 1726867578.29684: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867578.29687: getting variables 30575 1726867578.29689: in VariableManager get_vars() 30575 1726867578.29715: Calling all_inventory to load vars for managed_node3 30575 1726867578.29717: Calling groups_inventory to load vars for managed_node3 30575 1726867578.29720: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867578.29729: Calling all_plugins_play to load vars for managed_node3 30575 1726867578.29731: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867578.29733: Calling groups_plugins_play to load vars for managed_node3 30575 1726867578.30494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867578.31366: done with get_vars() 30575 1726867578.31382: done getting variables 30575 1726867578.31429: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867578.31504: variable 'profile' from source: play vars 30575 1726867578.31507: variable 'interface' from source: play vars 30575 1726867578.31552: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] 
********************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:26:18 -0400 (0:00:00.031) 0:00:13.693 ****** 30575 1726867578.31574: entering _queue_task() for managed_node3/set_fact 30575 1726867578.31802: worker is 1 (out of 1 available) 30575 1726867578.31815: exiting _queue_task() for managed_node3/set_fact 30575 1726867578.31830: done queuing things up, now waiting for results queue to drain 30575 1726867578.31832: waiting for pending results... 30575 1726867578.32003: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr 30575 1726867578.32092: in run() - task 0affcac9-a3a5-e081-a588-000000000405 30575 1726867578.32104: variable 'ansible_search_path' from source: unknown 30575 1726867578.32107: variable 'ansible_search_path' from source: unknown 30575 1726867578.32136: calling self._execute() 30575 1726867578.32203: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.32206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.32215: variable 'omit' from source: magic vars 30575 1726867578.32466: variable 'ansible_distribution_major_version' from source: facts 30575 1726867578.32476: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867578.32563: variable 'profile_stat' from source: set_fact 30575 1726867578.32572: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867578.32575: when evaluation is False, skipping this task 30575 1726867578.32580: _execute() done 30575 1726867578.32583: dumping result to json 30575 1726867578.32587: done dumping result, returning 30575 1726867578.32594: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-000000000405] 30575 1726867578.32603: sending task result for task 
0affcac9-a3a5-e081-a588-000000000405 30575 1726867578.32684: done sending task result for task 0affcac9-a3a5-e081-a588-000000000405 30575 1726867578.32687: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867578.32757: no more pending results, returning what we have 30575 1726867578.32760: results queue empty 30575 1726867578.32761: checking for any_errors_fatal 30575 1726867578.32765: done checking for any_errors_fatal 30575 1726867578.32766: checking for max_fail_percentage 30575 1726867578.32768: done checking for max_fail_percentage 30575 1726867578.32769: checking to see if all hosts have failed and the running result is not ok 30575 1726867578.32770: done checking to see if all hosts have failed 30575 1726867578.32770: getting the remaining hosts for this loop 30575 1726867578.32772: done getting the remaining hosts for this loop 30575 1726867578.32775: getting the next task for host managed_node3 30575 1726867578.32783: done getting next task for host managed_node3 30575 1726867578.32786: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30575 1726867578.32790: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867578.32794: getting variables 30575 1726867578.32795: in VariableManager get_vars() 30575 1726867578.32822: Calling all_inventory to load vars for managed_node3 30575 1726867578.32826: Calling groups_inventory to load vars for managed_node3 30575 1726867578.32829: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867578.32838: Calling all_plugins_play to load vars for managed_node3 30575 1726867578.32840: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867578.32842: Calling groups_plugins_play to load vars for managed_node3 30575 1726867578.33706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867578.34552: done with get_vars() 30575 1726867578.34566: done getting variables 30575 1726867578.34607: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867578.34681: variable 'profile' from source: play vars 30575 1726867578.34684: variable 'interface' from source: play vars 30575 1726867578.34721: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:26:18 -0400 (0:00:00.031) 
0:00:13.725 ****** 30575 1726867578.34746: entering _queue_task() for managed_node3/command 30575 1726867578.34949: worker is 1 (out of 1 available) 30575 1726867578.34962: exiting _queue_task() for managed_node3/command 30575 1726867578.34974: done queuing things up, now waiting for results queue to drain 30575 1726867578.34976: waiting for pending results... 30575 1726867578.35149: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr 30575 1726867578.35249: in run() - task 0affcac9-a3a5-e081-a588-000000000406 30575 1726867578.35254: variable 'ansible_search_path' from source: unknown 30575 1726867578.35256: variable 'ansible_search_path' from source: unknown 30575 1726867578.35288: calling self._execute() 30575 1726867578.35358: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.35362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.35370: variable 'omit' from source: magic vars 30575 1726867578.35652: variable 'ansible_distribution_major_version' from source: facts 30575 1726867578.35662: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867578.35744: variable 'profile_stat' from source: set_fact 30575 1726867578.35753: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867578.35756: when evaluation is False, skipping this task 30575 1726867578.35759: _execute() done 30575 1726867578.35762: dumping result to json 30575 1726867578.35764: done dumping result, returning 30575 1726867578.35774: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-000000000406] 30575 1726867578.35778: sending task result for task 0affcac9-a3a5-e081-a588-000000000406 30575 1726867578.35858: done sending task result for task 0affcac9-a3a5-e081-a588-000000000406 30575 1726867578.35861: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, 
"false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867578.35921: no more pending results, returning what we have 30575 1726867578.35927: results queue empty 30575 1726867578.35928: checking for any_errors_fatal 30575 1726867578.35932: done checking for any_errors_fatal 30575 1726867578.35933: checking for max_fail_percentage 30575 1726867578.35935: done checking for max_fail_percentage 30575 1726867578.35936: checking to see if all hosts have failed and the running result is not ok 30575 1726867578.35936: done checking to see if all hosts have failed 30575 1726867578.35937: getting the remaining hosts for this loop 30575 1726867578.35938: done getting the remaining hosts for this loop 30575 1726867578.35941: getting the next task for host managed_node3 30575 1726867578.35948: done getting next task for host managed_node3 30575 1726867578.35950: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30575 1726867578.35954: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867578.35957: getting variables 30575 1726867578.35958: in VariableManager get_vars() 30575 1726867578.35986: Calling all_inventory to load vars for managed_node3 30575 1726867578.35989: Calling groups_inventory to load vars for managed_node3 30575 1726867578.35991: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867578.36000: Calling all_plugins_play to load vars for managed_node3 30575 1726867578.36002: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867578.36004: Calling groups_plugins_play to load vars for managed_node3 30575 1726867578.37192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867578.38601: done with get_vars() 30575 1726867578.38629: done getting variables 30575 1726867578.38712: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867578.38845: variable 'profile' from source: play vars 30575 1726867578.38849: variable 'interface' from source: play vars 30575 1726867578.38926: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:26:18 -0400 (0:00:00.042) 0:00:13.767 ****** 30575 1726867578.38968: entering _queue_task() for managed_node3/set_fact 30575 1726867578.39250: worker is 1 (out of 1 available) 30575 1726867578.39265: exiting _queue_task() for managed_node3/set_fact 30575 
1726867578.39279: done queuing things up, now waiting for results queue to drain 30575 1726867578.39281: waiting for pending results... 30575 1726867578.39598: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr 30575 1726867578.39675: in run() - task 0affcac9-a3a5-e081-a588-000000000407 30575 1726867578.39692: variable 'ansible_search_path' from source: unknown 30575 1726867578.39699: variable 'ansible_search_path' from source: unknown 30575 1726867578.39756: calling self._execute() 30575 1726867578.39917: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.39920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.39925: variable 'omit' from source: magic vars 30575 1726867578.40253: variable 'ansible_distribution_major_version' from source: facts 30575 1726867578.40259: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867578.40340: variable 'profile_stat' from source: set_fact 30575 1726867578.40349: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867578.40352: when evaluation is False, skipping this task 30575 1726867578.40354: _execute() done 30575 1726867578.40359: dumping result to json 30575 1726867578.40361: done dumping result, returning 30575 1726867578.40372: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-000000000407] 30575 1726867578.40375: sending task result for task 0affcac9-a3a5-e081-a588-000000000407 30575 1726867578.40455: done sending task result for task 0affcac9-a3a5-e081-a588-000000000407 30575 1726867578.40458: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867578.40517: no more pending results, returning what we have 30575 1726867578.40521: results queue empty 30575 
1726867578.40522: checking for any_errors_fatal 30575 1726867578.40530: done checking for any_errors_fatal 30575 1726867578.40531: checking for max_fail_percentage 30575 1726867578.40533: done checking for max_fail_percentage 30575 1726867578.40533: checking to see if all hosts have failed and the running result is not ok 30575 1726867578.40534: done checking to see if all hosts have failed 30575 1726867578.40535: getting the remaining hosts for this loop 30575 1726867578.40536: done getting the remaining hosts for this loop 30575 1726867578.40540: getting the next task for host managed_node3 30575 1726867578.40549: done getting next task for host managed_node3 30575 1726867578.40551: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 30575 1726867578.40555: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867578.40558: getting variables 30575 1726867578.40560: in VariableManager get_vars() 30575 1726867578.40590: Calling all_inventory to load vars for managed_node3 30575 1726867578.40593: Calling groups_inventory to load vars for managed_node3 30575 1726867578.40595: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867578.40604: Calling all_plugins_play to load vars for managed_node3 30575 1726867578.40606: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867578.40609: Calling groups_plugins_play to load vars for managed_node3 30575 1726867578.41976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867578.43421: done with get_vars() 30575 1726867578.43438: done getting variables 30575 1726867578.43482: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867578.43566: variable 'profile' from source: play vars 30575 1726867578.43569: variable 'interface' from source: play vars 30575 1726867578.43612: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 17:26:18 -0400 (0:00:00.046) 0:00:13.813 ****** 30575 1726867578.43639: entering _queue_task() for managed_node3/assert 30575 1726867578.43869: worker is 1 (out of 1 available) 30575 1726867578.43884: exiting _queue_task() for managed_node3/assert 30575 1726867578.43896: done queuing things up, now waiting for results queue to drain 30575 1726867578.43897: waiting for pending results... 
30575 1726867578.44071: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'statebr' 30575 1726867578.44144: in run() - task 0affcac9-a3a5-e081-a588-000000000384 30575 1726867578.44156: variable 'ansible_search_path' from source: unknown 30575 1726867578.44160: variable 'ansible_search_path' from source: unknown 30575 1726867578.44190: calling self._execute() 30575 1726867578.44261: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.44265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.44274: variable 'omit' from source: magic vars 30575 1726867578.44531: variable 'ansible_distribution_major_version' from source: facts 30575 1726867578.44538: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867578.44544: variable 'omit' from source: magic vars 30575 1726867578.44575: variable 'omit' from source: magic vars 30575 1726867578.44644: variable 'profile' from source: play vars 30575 1726867578.44647: variable 'interface' from source: play vars 30575 1726867578.44696: variable 'interface' from source: play vars 30575 1726867578.44710: variable 'omit' from source: magic vars 30575 1726867578.44742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867578.44769: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867578.44790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867578.44805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867578.44816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867578.44840: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30575 1726867578.44843: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.44846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.44919: Set connection var ansible_pipelining to False 30575 1726867578.44922: Set connection var ansible_shell_type to sh 30575 1726867578.44928: Set connection var ansible_shell_executable to /bin/sh 30575 1726867578.44932: Set connection var ansible_timeout to 10 30575 1726867578.44937: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867578.44943: Set connection var ansible_connection to ssh 30575 1726867578.44963: variable 'ansible_shell_executable' from source: unknown 30575 1726867578.44966: variable 'ansible_connection' from source: unknown 30575 1726867578.44969: variable 'ansible_module_compression' from source: unknown 30575 1726867578.44971: variable 'ansible_shell_type' from source: unknown 30575 1726867578.44974: variable 'ansible_shell_executable' from source: unknown 30575 1726867578.44976: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.44980: variable 'ansible_pipelining' from source: unknown 30575 1726867578.44983: variable 'ansible_timeout' from source: unknown 30575 1726867578.44988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.45083: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867578.45093: variable 'omit' from source: magic vars 30575 1726867578.45105: starting attempt loop 30575 1726867578.45108: running the handler 30575 1726867578.45176: variable 'lsr_net_profile_exists' from source: set_fact 30575 1726867578.45182: Evaluated conditional 
(lsr_net_profile_exists): True 30575 1726867578.45187: handler run complete 30575 1726867578.45199: attempt loop complete, returning result 30575 1726867578.45202: _execute() done 30575 1726867578.45206: dumping result to json 30575 1726867578.45209: done dumping result, returning 30575 1726867578.45217: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'statebr' [0affcac9-a3a5-e081-a588-000000000384] 30575 1726867578.45220: sending task result for task 0affcac9-a3a5-e081-a588-000000000384 30575 1726867578.45297: done sending task result for task 0affcac9-a3a5-e081-a588-000000000384 30575 1726867578.45300: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867578.45367: no more pending results, returning what we have 30575 1726867578.45371: results queue empty 30575 1726867578.45372: checking for any_errors_fatal 30575 1726867578.45380: done checking for any_errors_fatal 30575 1726867578.45380: checking for max_fail_percentage 30575 1726867578.45382: done checking for max_fail_percentage 30575 1726867578.45383: checking to see if all hosts have failed and the running result is not ok 30575 1726867578.45383: done checking to see if all hosts have failed 30575 1726867578.45384: getting the remaining hosts for this loop 30575 1726867578.45386: done getting the remaining hosts for this loop 30575 1726867578.45389: getting the next task for host managed_node3 30575 1726867578.45395: done getting next task for host managed_node3 30575 1726867578.45397: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 30575 1726867578.45401: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867578.45404: getting variables 30575 1726867578.45406: in VariableManager get_vars() 30575 1726867578.45435: Calling all_inventory to load vars for managed_node3 30575 1726867578.45437: Calling groups_inventory to load vars for managed_node3 30575 1726867578.45440: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867578.45449: Calling all_plugins_play to load vars for managed_node3 30575 1726867578.45451: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867578.45454: Calling groups_plugins_play to load vars for managed_node3 30575 1726867578.46748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867578.48484: done with get_vars() 30575 1726867578.48504: done getting variables 30575 1726867578.48561: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867578.48667: variable 'profile' from source: play vars 30575 1726867578.48670: variable 'interface' from source: play vars 30575 1726867578.48728: variable 'interface' from 
source: play vars TASK [Assert that the ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 17:26:18 -0400 (0:00:00.051) 0:00:13.865 ****** 30575 1726867578.48761: entering _queue_task() for managed_node3/assert 30575 1726867578.49041: worker is 1 (out of 1 available) 30575 1726867578.49054: exiting _queue_task() for managed_node3/assert 30575 1726867578.49066: done queuing things up, now waiting for results queue to drain 30575 1726867578.49068: waiting for pending results... 30575 1726867578.49355: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'statebr' 30575 1726867578.49473: in run() - task 0affcac9-a3a5-e081-a588-000000000385 30575 1726867578.49492: variable 'ansible_search_path' from source: unknown 30575 1726867578.49495: variable 'ansible_search_path' from source: unknown 30575 1726867578.49540: calling self._execute() 30575 1726867578.49788: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.49792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.49795: variable 'omit' from source: magic vars 30575 1726867578.50036: variable 'ansible_distribution_major_version' from source: facts 30575 1726867578.50050: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867578.50057: variable 'omit' from source: magic vars 30575 1726867578.50100: variable 'omit' from source: magic vars 30575 1726867578.50284: variable 'profile' from source: play vars 30575 1726867578.50287: variable 'interface' from source: play vars 30575 1726867578.50290: variable 'interface' from source: play vars 30575 1726867578.50292: variable 'omit' from source: magic vars 30575 1726867578.50335: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 
30575 1726867578.50444: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867578.50447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867578.50449: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867578.50452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867578.50454: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867578.50457: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.50459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.50554: Set connection var ansible_pipelining to False 30575 1726867578.50557: Set connection var ansible_shell_type to sh 30575 1726867578.50559: Set connection var ansible_shell_executable to /bin/sh 30575 1726867578.50561: Set connection var ansible_timeout to 10 30575 1726867578.50566: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867578.50573: Set connection var ansible_connection to ssh 30575 1726867578.50603: variable 'ansible_shell_executable' from source: unknown 30575 1726867578.50606: variable 'ansible_connection' from source: unknown 30575 1726867578.50608: variable 'ansible_module_compression' from source: unknown 30575 1726867578.50611: variable 'ansible_shell_type' from source: unknown 30575 1726867578.50613: variable 'ansible_shell_executable' from source: unknown 30575 1726867578.50615: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.50617: variable 'ansible_pipelining' from source: unknown 30575 1726867578.50620: variable 'ansible_timeout' from source: unknown 30575 1726867578.50628: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.50748: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867578.50758: variable 'omit' from source: magic vars 30575 1726867578.50770: starting attempt loop 30575 1726867578.50773: running the handler 30575 1726867578.50881: variable 'lsr_net_profile_ansible_managed' from source: set_fact 30575 1726867578.50884: Evaluated conditional (lsr_net_profile_ansible_managed): True 30575 1726867578.50887: handler run complete 30575 1726867578.50901: attempt loop complete, returning result 30575 1726867578.50904: _execute() done 30575 1726867578.50906: dumping result to json 30575 1726867578.50909: done dumping result, returning 30575 1726867578.50951: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'statebr' [0affcac9-a3a5-e081-a588-000000000385] 30575 1726867578.50955: sending task result for task 0affcac9-a3a5-e081-a588-000000000385 30575 1726867578.51126: done sending task result for task 0affcac9-a3a5-e081-a588-000000000385 30575 1726867578.51130: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867578.51170: no more pending results, returning what we have 30575 1726867578.51173: results queue empty 30575 1726867578.51174: checking for any_errors_fatal 30575 1726867578.51180: done checking for any_errors_fatal 30575 1726867578.51181: checking for max_fail_percentage 30575 1726867578.51182: done checking for max_fail_percentage 30575 1726867578.51183: checking to see if all hosts have failed and the running result is not ok 30575 1726867578.51184: done checking to see if all hosts have failed 30575 1726867578.51185: 
getting the remaining hosts for this loop 30575 1726867578.51187: done getting the remaining hosts for this loop 30575 1726867578.51190: getting the next task for host managed_node3 30575 1726867578.51196: done getting next task for host managed_node3 30575 1726867578.51198: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 30575 1726867578.51201: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867578.51205: getting variables 30575 1726867578.51207: in VariableManager get_vars() 30575 1726867578.51232: Calling all_inventory to load vars for managed_node3 30575 1726867578.51235: Calling groups_inventory to load vars for managed_node3 30575 1726867578.51238: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867578.51247: Calling all_plugins_play to load vars for managed_node3 30575 1726867578.51250: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867578.51253: Calling groups_plugins_play to load vars for managed_node3 30575 1726867578.52604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867578.54090: done with get_vars() 30575 1726867578.54111: done getting variables 30575 1726867578.54168: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867578.54276: variable 'profile' from source: play vars 30575 1726867578.54282: variable 'interface' from source: play vars 30575 1726867578.54343: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 17:26:18 -0400 (0:00:00.056) 0:00:13.921 ****** 30575 1726867578.54373: entering _queue_task() for managed_node3/assert 30575 1726867578.54663: worker is 1 (out of 1 available) 30575 1726867578.54675: exiting _queue_task() for managed_node3/assert 30575 1726867578.54888: done queuing things up, now waiting for results queue to drain 30575 1726867578.54890: waiting for pending results... 
30575 1726867578.54965: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in statebr 30575 1726867578.55250: in run() - task 0affcac9-a3a5-e081-a588-000000000386 30575 1726867578.55254: variable 'ansible_search_path' from source: unknown 30575 1726867578.55257: variable 'ansible_search_path' from source: unknown 30575 1726867578.55260: calling self._execute() 30575 1726867578.55263: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.55265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.55268: variable 'omit' from source: magic vars 30575 1726867578.55574: variable 'ansible_distribution_major_version' from source: facts 30575 1726867578.55587: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867578.55591: variable 'omit' from source: magic vars 30575 1726867578.55634: variable 'omit' from source: magic vars 30575 1726867578.55732: variable 'profile' from source: play vars 30575 1726867578.55736: variable 'interface' from source: play vars 30575 1726867578.55883: variable 'interface' from source: play vars 30575 1726867578.55887: variable 'omit' from source: magic vars 30575 1726867578.55889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867578.55901: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867578.55927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867578.55938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867578.55951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867578.55986: variable 'inventory_hostname' from source: host 
vars for 'managed_node3' 30575 1726867578.56033: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.56036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.56095: Set connection var ansible_pipelining to False 30575 1726867578.56098: Set connection var ansible_shell_type to sh 30575 1726867578.56105: Set connection var ansible_shell_executable to /bin/sh 30575 1726867578.56111: Set connection var ansible_timeout to 10 30575 1726867578.56117: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867578.56127: Set connection var ansible_connection to ssh 30575 1726867578.56149: variable 'ansible_shell_executable' from source: unknown 30575 1726867578.56152: variable 'ansible_connection' from source: unknown 30575 1726867578.56155: variable 'ansible_module_compression' from source: unknown 30575 1726867578.56157: variable 'ansible_shell_type' from source: unknown 30575 1726867578.56248: variable 'ansible_shell_executable' from source: unknown 30575 1726867578.56254: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.56257: variable 'ansible_pipelining' from source: unknown 30575 1726867578.56260: variable 'ansible_timeout' from source: unknown 30575 1726867578.56263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.56320: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867578.56330: variable 'omit' from source: magic vars 30575 1726867578.56337: starting attempt loop 30575 1726867578.56340: running the handler 30575 1726867578.56588: variable 'lsr_net_profile_fingerprint' from source: set_fact 30575 1726867578.56592: Evaluated 
conditional (lsr_net_profile_fingerprint): True 30575 1726867578.56598: handler run complete 30575 1726867578.56600: attempt loop complete, returning result 30575 1726867578.56602: _execute() done 30575 1726867578.56603: dumping result to json 30575 1726867578.56605: done dumping result, returning 30575 1726867578.56607: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in statebr [0affcac9-a3a5-e081-a588-000000000386] 30575 1726867578.56609: sending task result for task 0affcac9-a3a5-e081-a588-000000000386 30575 1726867578.56666: done sending task result for task 0affcac9-a3a5-e081-a588-000000000386 30575 1726867578.56669: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867578.56745: no more pending results, returning what we have 30575 1726867578.56748: results queue empty 30575 1726867578.56749: checking for any_errors_fatal 30575 1726867578.56754: done checking for any_errors_fatal 30575 1726867578.56755: checking for max_fail_percentage 30575 1726867578.56757: done checking for max_fail_percentage 30575 1726867578.56758: checking to see if all hosts have failed and the running result is not ok 30575 1726867578.56758: done checking to see if all hosts have failed 30575 1726867578.56759: getting the remaining hosts for this loop 30575 1726867578.56761: done getting the remaining hosts for this loop 30575 1726867578.56765: getting the next task for host managed_node3 30575 1726867578.56773: done getting next task for host managed_node3 30575 1726867578.56779: ^ task is: TASK: Conditional asserts 30575 1726867578.56782: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867578.56787: getting variables 30575 1726867578.56788: in VariableManager get_vars() 30575 1726867578.56818: Calling all_inventory to load vars for managed_node3 30575 1726867578.56821: Calling groups_inventory to load vars for managed_node3 30575 1726867578.56825: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867578.56835: Calling all_plugins_play to load vars for managed_node3 30575 1726867578.56838: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867578.56841: Calling groups_plugins_play to load vars for managed_node3 30575 1726867578.58920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867578.60575: done with get_vars() 30575 1726867578.60799: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 17:26:18 -0400 (0:00:00.065) 0:00:13.986 ****** 30575 1726867578.60895: entering _queue_task() for managed_node3/include_tasks 30575 1726867578.61229: worker is 1 (out of 1 available) 30575 1726867578.61239: exiting _queue_task() for managed_node3/include_tasks 30575 1726867578.61251: done queuing things up, now waiting for results queue to drain 30575 1726867578.61252: waiting for pending results... 
30575 1726867578.61584: running TaskExecutor() for managed_node3/TASK: Conditional asserts 30575 1726867578.61680: in run() - task 0affcac9-a3a5-e081-a588-000000000097 30575 1726867578.61685: variable 'ansible_search_path' from source: unknown 30575 1726867578.61688: variable 'ansible_search_path' from source: unknown 30575 1726867578.61926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867578.63994: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867578.64022: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867578.64058: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867578.64098: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867578.64120: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867578.64414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867578.64486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867578.64490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867578.64505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30575 1726867578.64519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867578.64825: variable 'lsr_assert_when' from source: include params 30575 1726867578.64929: variable 'network_provider' from source: set_fact 30575 1726867578.65199: variable 'omit' from source: magic vars 30575 1726867578.65321: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.65329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.65333: variable 'omit' from source: magic vars 30575 1726867578.65581: variable 'ansible_distribution_major_version' from source: facts 30575 1726867578.65584: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867578.65644: variable 'item' from source: unknown 30575 1726867578.65651: Evaluated conditional (item['condition']): True 30575 1726867578.65733: variable 'item' from source: unknown 30575 1726867578.65762: variable 'item' from source: unknown 30575 1726867578.65828: variable 'item' from source: unknown 30575 1726867578.66122: dumping result to json 30575 1726867578.66127: done dumping result, returning 30575 1726867578.66129: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [0affcac9-a3a5-e081-a588-000000000097] 30575 1726867578.66131: sending task result for task 0affcac9-a3a5-e081-a588-000000000097 30575 1726867578.66170: done sending task result for task 0affcac9-a3a5-e081-a588-000000000097 30575 1726867578.66172: WORKER PROCESS EXITING 30575 1726867578.66248: no more pending results, returning what we have 30575 1726867578.66253: in VariableManager get_vars() 30575 1726867578.66285: Calling all_inventory to load vars for managed_node3 30575 1726867578.66288: Calling groups_inventory to load vars for managed_node3 30575 1726867578.66291: 
Calling all_plugins_inventory to load vars for managed_node3 30575 1726867578.66301: Calling all_plugins_play to load vars for managed_node3 30575 1726867578.66304: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867578.66307: Calling groups_plugins_play to load vars for managed_node3 30575 1726867578.67669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867578.70493: done with get_vars() 30575 1726867578.70516: variable 'ansible_search_path' from source: unknown 30575 1726867578.70518: variable 'ansible_search_path' from source: unknown 30575 1726867578.70561: we have included files to process 30575 1726867578.70562: generating all_blocks data 30575 1726867578.70673: done generating all_blocks data 30575 1726867578.70683: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30575 1726867578.70684: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30575 1726867578.70694: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30575 1726867578.71042: in VariableManager get_vars() 30575 1726867578.71055: done with get_vars() 30575 1726867578.71144: done processing included file 30575 1726867578.71145: iterating over new_blocks loaded from include file 30575 1726867578.71146: in VariableManager get_vars() 30575 1726867578.71156: done with get_vars() 30575 1726867578.71157: filtering new block on tags 30575 1726867578.71180: done filtering new block on tags 30575 1726867578.71182: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 => (item={'what': 
'tasks/assert_device_present.yml', 'condition': True}) 30575 1726867578.71186: extending task lists for all hosts with included blocks 30575 1726867578.71912: done extending task lists 30575 1726867578.71913: done processing included files 30575 1726867578.71913: results queue empty 30575 1726867578.71914: checking for any_errors_fatal 30575 1726867578.71917: done checking for any_errors_fatal 30575 1726867578.71917: checking for max_fail_percentage 30575 1726867578.71918: done checking for max_fail_percentage 30575 1726867578.71918: checking to see if all hosts have failed and the running result is not ok 30575 1726867578.71919: done checking to see if all hosts have failed 30575 1726867578.71919: getting the remaining hosts for this loop 30575 1726867578.71920: done getting the remaining hosts for this loop 30575 1726867578.71922: getting the next task for host managed_node3 30575 1726867578.71925: done getting next task for host managed_node3 30575 1726867578.71927: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30575 1726867578.71929: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867578.71935: getting variables 30575 1726867578.71936: in VariableManager get_vars() 30575 1726867578.71944: Calling all_inventory to load vars for managed_node3 30575 1726867578.71946: Calling groups_inventory to load vars for managed_node3 30575 1726867578.71948: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867578.71951: Calling all_plugins_play to load vars for managed_node3 30575 1726867578.71953: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867578.71955: Calling groups_plugins_play to load vars for managed_node3 30575 1726867578.72674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867578.74486: done with get_vars() 30575 1726867578.74513: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 17:26:18 -0400 (0:00:00.137) 0:00:14.123 ****** 30575 1726867578.74597: entering _queue_task() for managed_node3/include_tasks 30575 1726867578.74931: worker is 1 (out of 1 available) 30575 1726867578.74948: exiting _queue_task() for managed_node3/include_tasks 30575 1726867578.74962: done queuing things up, now waiting for results queue to drain 30575 1726867578.74964: waiting for pending results... 
30575 1726867578.75186: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 30575 1726867578.75313: in run() - task 0affcac9-a3a5-e081-a588-000000000452 30575 1726867578.75348: variable 'ansible_search_path' from source: unknown 30575 1726867578.75351: variable 'ansible_search_path' from source: unknown 30575 1726867578.75362: calling self._execute() 30575 1726867578.75483: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.75488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.75492: variable 'omit' from source: magic vars 30575 1726867578.75894: variable 'ansible_distribution_major_version' from source: facts 30575 1726867578.75911: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867578.75920: _execute() done 30575 1726867578.75937: dumping result to json 30575 1726867578.75953: done dumping result, returning 30575 1726867578.75970: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-e081-a588-000000000452] 30575 1726867578.75990: sending task result for task 0affcac9-a3a5-e081-a588-000000000452 30575 1726867578.76199: done sending task result for task 0affcac9-a3a5-e081-a588-000000000452 30575 1726867578.76203: WORKER PROCESS EXITING 30575 1726867578.76235: no more pending results, returning what we have 30575 1726867578.76240: in VariableManager get_vars() 30575 1726867578.76279: Calling all_inventory to load vars for managed_node3 30575 1726867578.76283: Calling groups_inventory to load vars for managed_node3 30575 1726867578.76287: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867578.76301: Calling all_plugins_play to load vars for managed_node3 30575 1726867578.76304: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867578.76307: Calling groups_plugins_play to load vars for managed_node3 30575 
1726867578.77799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867578.78759: done with get_vars() 30575 1726867578.78771: variable 'ansible_search_path' from source: unknown 30575 1726867578.78772: variable 'ansible_search_path' from source: unknown 30575 1726867578.78872: variable 'item' from source: include params 30575 1726867578.78901: we have included files to process 30575 1726867578.78902: generating all_blocks data 30575 1726867578.78903: done generating all_blocks data 30575 1726867578.78904: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867578.78905: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867578.78906: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867578.79029: done processing included file 30575 1726867578.79031: iterating over new_blocks loaded from include file 30575 1726867578.79031: in VariableManager get_vars() 30575 1726867578.79042: done with get_vars() 30575 1726867578.79044: filtering new block on tags 30575 1726867578.79059: done filtering new block on tags 30575 1726867578.79060: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 30575 1726867578.79063: extending task lists for all hosts with included blocks 30575 1726867578.79194: done extending task lists 30575 1726867578.79196: done processing included files 30575 1726867578.79196: results queue empty 30575 1726867578.79197: checking for any_errors_fatal 30575 1726867578.79200: done checking for any_errors_fatal 30575 1726867578.79201: checking for 
max_fail_percentage 30575 1726867578.79202: done checking for max_fail_percentage 30575 1726867578.79203: checking to see if all hosts have failed and the running result is not ok 30575 1726867578.79203: done checking to see if all hosts have failed 30575 1726867578.79204: getting the remaining hosts for this loop 30575 1726867578.79205: done getting the remaining hosts for this loop 30575 1726867578.79208: getting the next task for host managed_node3 30575 1726867578.79212: done getting next task for host managed_node3 30575 1726867578.79214: ^ task is: TASK: Get stat for interface {{ interface }} 30575 1726867578.79217: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867578.79219: getting variables 30575 1726867578.79220: in VariableManager get_vars() 30575 1726867578.79231: Calling all_inventory to load vars for managed_node3 30575 1726867578.79234: Calling groups_inventory to load vars for managed_node3 30575 1726867578.79236: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867578.79241: Calling all_plugins_play to load vars for managed_node3 30575 1726867578.79243: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867578.79246: Calling groups_plugins_play to load vars for managed_node3 30575 1726867578.80849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867578.82219: done with get_vars() 30575 1726867578.82238: done getting variables 30575 1726867578.82361: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:26:18 -0400 (0:00:00.077) 0:00:14.201 ****** 30575 1726867578.82395: entering _queue_task() for managed_node3/stat 30575 1726867578.82629: worker is 1 (out of 1 available) 30575 1726867578.82643: exiting _queue_task() for managed_node3/stat 30575 1726867578.82655: done queuing things up, now waiting for results queue to drain 30575 1726867578.82657: waiting for pending results... 
30575 1726867578.82843: running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr 30575 1726867578.82926: in run() - task 0affcac9-a3a5-e081-a588-0000000004e8 30575 1726867578.82936: variable 'ansible_search_path' from source: unknown 30575 1726867578.82939: variable 'ansible_search_path' from source: unknown 30575 1726867578.82966: calling self._execute() 30575 1726867578.83038: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.83042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.83051: variable 'omit' from source: magic vars 30575 1726867578.83309: variable 'ansible_distribution_major_version' from source: facts 30575 1726867578.83319: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867578.83327: variable 'omit' from source: magic vars 30575 1726867578.83364: variable 'omit' from source: magic vars 30575 1726867578.83449: variable 'interface' from source: play vars 30575 1726867578.83467: variable 'omit' from source: magic vars 30575 1726867578.83516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867578.83573: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867578.83602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867578.83609: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867578.83635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867578.83656: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867578.83659: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.83671: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.83786: Set connection var ansible_pipelining to False 30575 1726867578.83789: Set connection var ansible_shell_type to sh 30575 1726867578.83792: Set connection var ansible_shell_executable to /bin/sh 30575 1726867578.83795: Set connection var ansible_timeout to 10 30575 1726867578.83797: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867578.83841: Set connection var ansible_connection to ssh 30575 1726867578.83845: variable 'ansible_shell_executable' from source: unknown 30575 1726867578.83848: variable 'ansible_connection' from source: unknown 30575 1726867578.83851: variable 'ansible_module_compression' from source: unknown 30575 1726867578.83853: variable 'ansible_shell_type' from source: unknown 30575 1726867578.83860: variable 'ansible_shell_executable' from source: unknown 30575 1726867578.83863: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867578.83865: variable 'ansible_pipelining' from source: unknown 30575 1726867578.83867: variable 'ansible_timeout' from source: unknown 30575 1726867578.83869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867578.84108: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867578.84113: variable 'omit' from source: magic vars 30575 1726867578.84116: starting attempt loop 30575 1726867578.84118: running the handler 30575 1726867578.84120: _low_level_execute_command(): starting 30575 1726867578.84136: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867578.85459: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867578.85564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867578.85728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867578.87407: stdout chunk (state=3): >>>/root <<< 30575 1726867578.87519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867578.87525: stdout chunk (state=3): >>><<< 30575 1726867578.87536: stderr chunk (state=3): >>><<< 30575 1726867578.87575: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867578.87588: _low_level_execute_command(): starting 30575 1726867578.87592: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867578.8756344-31200-193360438095477 `" && echo ansible-tmp-1726867578.8756344-31200-193360438095477="` echo /root/.ansible/tmp/ansible-tmp-1726867578.8756344-31200-193360438095477 `" ) && sleep 0' 30575 1726867578.88331: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867578.88334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867578.88341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867578.88344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867578.88346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867578.88440: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867578.88471: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867578.88792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867578.90633: stdout chunk (state=3): >>>ansible-tmp-1726867578.8756344-31200-193360438095477=/root/.ansible/tmp/ansible-tmp-1726867578.8756344-31200-193360438095477 <<< 30575 1726867578.90733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867578.90770: stderr chunk (state=3): >>><<< 30575 1726867578.90773: stdout chunk (state=3): >>><<< 30575 1726867578.90786: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867578.8756344-31200-193360438095477=/root/.ansible/tmp/ansible-tmp-1726867578.8756344-31200-193360438095477 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867578.90836: variable 'ansible_module_compression' from source: unknown 30575 1726867578.90905: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30575 1726867578.90953: variable 'ansible_facts' from source: unknown 30575 1726867578.91018: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867578.8756344-31200-193360438095477/AnsiballZ_stat.py 30575 1726867578.91113: Sending initial data 30575 1726867578.91119: Sent initial data (153 bytes) 30575 1726867578.91576: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867578.91583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867578.91585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867578.91587: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 <<< 30575 1726867578.91590: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867578.91646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867578.91650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867578.91654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867578.91716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867578.93250: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867578.93285: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867578.93329: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpry6roski /root/.ansible/tmp/ansible-tmp-1726867578.8756344-31200-193360438095477/AnsiballZ_stat.py <<< 30575 1726867578.93340: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867578.8756344-31200-193360438095477/AnsiballZ_stat.py" <<< 30575 1726867578.93372: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpry6roski" to remote "/root/.ansible/tmp/ansible-tmp-1726867578.8756344-31200-193360438095477/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867578.8756344-31200-193360438095477/AnsiballZ_stat.py" <<< 30575 1726867578.93918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867578.93951: stderr chunk (state=3): >>><<< 30575 1726867578.93954: stdout chunk (state=3): >>><<< 30575 1726867578.93975: done transferring module to remote 30575 1726867578.93984: _low_level_execute_command(): starting 30575 1726867578.93991: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867578.8756344-31200-193360438095477/ /root/.ansible/tmp/ansible-tmp-1726867578.8756344-31200-193360438095477/AnsiballZ_stat.py && sleep 0' 30575 1726867578.94403: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867578.94407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867578.94409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass <<< 30575 1726867578.94411: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867578.94413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867578.94463: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867578.94466: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867578.94514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867578.96230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867578.96250: stderr chunk (state=3): >>><<< 30575 1726867578.96254: stdout chunk (state=3): >>><<< 30575 1726867578.96271: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867578.96274: _low_level_execute_command(): starting 30575 1726867578.96279: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867578.8756344-31200-193360438095477/AnsiballZ_stat.py && sleep 0' 30575 1726867578.96647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867578.96674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867578.96679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867578.96681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867578.96684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867578.96686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867578.96737: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867578.96744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867578.96829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867579.11993: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31194, "dev": 23, "nlink": 1, "atime": 1726867575.8552756, "mtime": 1726867575.8552756, "ctime": 1726867575.8552756, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30575 1726867579.13259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867579.13315: stderr chunk (state=3): >>><<< 30575 1726867579.13318: stdout chunk (state=3): >>><<< 30575 1726867579.13352: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31194, "dev": 23, "nlink": 1, "atime": 1726867575.8552756, "mtime": 1726867575.8552756, "ctime": 1726867575.8552756, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867579.13419: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867578.8756344-31200-193360438095477/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867579.13443: _low_level_execute_command(): starting 30575 1726867579.13446: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867578.8756344-31200-193360438095477/ > /dev/null 2>&1 && sleep 0' 30575 1726867579.14152: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867579.14162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867579.14205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867579.14244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867579.16063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867579.16088: stderr chunk (state=3): >>><<< 30575 1726867579.16091: stdout chunk (state=3): >>><<< 30575 1726867579.16108: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867579.16113: handler run complete 30575 1726867579.16145: attempt loop complete, returning result 30575 1726867579.16149: _execute() done 30575 1726867579.16152: dumping result to json 30575 1726867579.16156: done dumping result, returning 30575 1726867579.16163: done running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr [0affcac9-a3a5-e081-a588-0000000004e8] 30575 1726867579.16167: sending task result for task 0affcac9-a3a5-e081-a588-0000000004e8 30575 1726867579.16269: done sending task result for task 0affcac9-a3a5-e081-a588-0000000004e8 30575 1726867579.16272: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726867575.8552756, "block_size": 4096, "blocks": 0, "ctime": 1726867575.8552756, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 31194, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726867575.8552756, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 30575 1726867579.16362: no more pending results, returning what we have 30575 1726867579.16365: results queue empty 30575 1726867579.16366: checking for any_errors_fatal 30575 1726867579.16368: done checking for any_errors_fatal 30575 1726867579.16368: checking for max_fail_percentage 30575 1726867579.16370: done 
checking for max_fail_percentage 30575 1726867579.16371: checking to see if all hosts have failed and the running result is not ok 30575 1726867579.16372: done checking to see if all hosts have failed 30575 1726867579.16372: getting the remaining hosts for this loop 30575 1726867579.16374: done getting the remaining hosts for this loop 30575 1726867579.16381: getting the next task for host managed_node3 30575 1726867579.16390: done getting next task for host managed_node3 30575 1726867579.16392: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 30575 1726867579.16396: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867579.16401: getting variables 30575 1726867579.16402: in VariableManager get_vars() 30575 1726867579.16433: Calling all_inventory to load vars for managed_node3 30575 1726867579.16436: Calling groups_inventory to load vars for managed_node3 30575 1726867579.16439: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867579.16449: Calling all_plugins_play to load vars for managed_node3 30575 1726867579.16451: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867579.16453: Calling groups_plugins_play to load vars for managed_node3 30575 1726867579.17496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867579.18650: done with get_vars() 30575 1726867579.18670: done getting variables 30575 1726867579.18717: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867579.18818: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 17:26:19 -0400 (0:00:00.364) 0:00:14.565 ****** 30575 1726867579.18843: entering _queue_task() for managed_node3/assert 30575 1726867579.19109: worker is 1 (out of 1 available) 30575 1726867579.19121: exiting _queue_task() for managed_node3/assert 30575 1726867579.19136: done queuing things up, now waiting for results queue to drain 30575 1726867579.19138: waiting for pending results... 
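The task being queued here lives at `tasks/assert_device_present.yml:5` and consumes the `interface_stat` result recorded by the earlier `stat` call (whose `module_args` are visible in the trace above). A plausible sketch of that stat-plus-assert sequence is below; the actual file contents are not shown in this log, and the exact task wording is an assumption:

```yaml
# Sketch of the stat + assert pair traced in this log (hypothetical wording).
# The module_args match those shown in the stat invocation above.
- name: Get stat for interface statebr
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"  # symlink exists only while the device does
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat

- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists
```

The trace shows the conditional `interface_stat.stat.exists` evaluating to `True`, which is why the assert reports "All assertions passed".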
30575 1726867579.19401: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'statebr' 30575 1726867579.19469: in run() - task 0affcac9-a3a5-e081-a588-000000000453 30575 1726867579.19485: variable 'ansible_search_path' from source: unknown 30575 1726867579.19489: variable 'ansible_search_path' from source: unknown 30575 1726867579.19541: calling self._execute() 30575 1726867579.19614: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867579.19618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867579.19622: variable 'omit' from source: magic vars 30575 1726867579.19902: variable 'ansible_distribution_major_version' from source: facts 30575 1726867579.19911: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867579.19919: variable 'omit' from source: magic vars 30575 1726867579.19995: variable 'omit' from source: magic vars 30575 1726867579.20073: variable 'interface' from source: play vars 30575 1726867579.20107: variable 'omit' from source: magic vars 30575 1726867579.20141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867579.20188: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867579.20208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867579.20231: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867579.20241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867579.20263: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867579.20266: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867579.20269: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867579.20356: Set connection var ansible_pipelining to False 30575 1726867579.20359: Set connection var ansible_shell_type to sh 30575 1726867579.20364: Set connection var ansible_shell_executable to /bin/sh 30575 1726867579.20369: Set connection var ansible_timeout to 10 30575 1726867579.20374: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867579.20382: Set connection var ansible_connection to ssh 30575 1726867579.20399: variable 'ansible_shell_executable' from source: unknown 30575 1726867579.20402: variable 'ansible_connection' from source: unknown 30575 1726867579.20405: variable 'ansible_module_compression' from source: unknown 30575 1726867579.20407: variable 'ansible_shell_type' from source: unknown 30575 1726867579.20409: variable 'ansible_shell_executable' from source: unknown 30575 1726867579.20411: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867579.20429: variable 'ansible_pipelining' from source: unknown 30575 1726867579.20432: variable 'ansible_timeout' from source: unknown 30575 1726867579.20434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867579.20537: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867579.20545: variable 'omit' from source: magic vars 30575 1726867579.20551: starting attempt loop 30575 1726867579.20554: running the handler 30575 1726867579.20673: variable 'interface_stat' from source: set_fact 30575 1726867579.20691: Evaluated conditional (interface_stat.stat.exists): True 30575 1726867579.20695: handler run complete 30575 1726867579.20706: attempt loop complete, returning result 30575 
1726867579.20709: _execute() done 30575 1726867579.20711: dumping result to json 30575 1726867579.20713: done dumping result, returning 30575 1726867579.20720: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'statebr' [0affcac9-a3a5-e081-a588-000000000453] 30575 1726867579.20726: sending task result for task 0affcac9-a3a5-e081-a588-000000000453 30575 1726867579.20804: done sending task result for task 0affcac9-a3a5-e081-a588-000000000453 30575 1726867579.20807: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867579.20861: no more pending results, returning what we have 30575 1726867579.20867: results queue empty 30575 1726867579.20868: checking for any_errors_fatal 30575 1726867579.20881: done checking for any_errors_fatal 30575 1726867579.20882: checking for max_fail_percentage 30575 1726867579.20883: done checking for max_fail_percentage 30575 1726867579.20884: checking to see if all hosts have failed and the running result is not ok 30575 1726867579.20885: done checking to see if all hosts have failed 30575 1726867579.20886: getting the remaining hosts for this loop 30575 1726867579.20887: done getting the remaining hosts for this loop 30575 1726867579.20891: getting the next task for host managed_node3 30575 1726867579.20903: done getting next task for host managed_node3 30575 1726867579.20906: ^ task is: TASK: Success in test '{{ lsr_description }}' 30575 1726867579.20909: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30575 1726867579.20913: getting variables 30575 1726867579.20914: in VariableManager get_vars() 30575 1726867579.20968: Calling all_inventory to load vars for managed_node3 30575 1726867579.20971: Calling groups_inventory to load vars for managed_node3 30575 1726867579.20974: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867579.20985: Calling all_plugins_play to load vars for managed_node3 30575 1726867579.20987: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867579.20990: Calling groups_plugins_play to load vars for managed_node3 30575 1726867579.22002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867579.23052: done with get_vars() 30575 1726867579.23067: done getting variables 30575 1726867579.23149: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867579.23240: variable 'lsr_description' from source: include params TASK [Success in test 'I can create a profile'] ******************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 17:26:19 -0400 (0:00:00.044) 0:00:14.610 ****** 30575 1726867579.23265: entering _queue_task() for managed_node3/debug 30575 1726867579.23539: worker is 1 (out of 1 available) 30575 1726867579.23557: exiting _queue_task() for managed_node3/debug 30575 1726867579.23571: done queuing things up, now waiting for results queue to drain 30575 1726867579.23573: waiting for pending results... 
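The debug task dispatched here (at `run_test.yml:47`) templates `lsr_description` into both its name and its message; judging from the rendered task header and the `MSG` emitted later in the log, it is roughly the following (a sketch, not the verified file contents):

```yaml
# Hypothetical reconstruction of the "Success in test" debug task.
- name: Success in test '{{ lsr_description }}'
  ansible.builtin.debug:
    msg: "+++++ Success in test '{{ lsr_description }}' +++++"
```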
30575 1726867579.23799: running TaskExecutor() for managed_node3/TASK: Success in test 'I can create a profile' 30575 1726867579.23874: in run() - task 0affcac9-a3a5-e081-a588-000000000098 30575 1726867579.23887: variable 'ansible_search_path' from source: unknown 30575 1726867579.23890: variable 'ansible_search_path' from source: unknown 30575 1726867579.23921: calling self._execute() 30575 1726867579.23993: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867579.23996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867579.24005: variable 'omit' from source: magic vars 30575 1726867579.24275: variable 'ansible_distribution_major_version' from source: facts 30575 1726867579.24286: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867579.24292: variable 'omit' from source: magic vars 30575 1726867579.24319: variable 'omit' from source: magic vars 30575 1726867579.24392: variable 'lsr_description' from source: include params 30575 1726867579.24408: variable 'omit' from source: magic vars 30575 1726867579.24440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867579.24471: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867579.24490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867579.24503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867579.24513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867579.24539: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867579.24542: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867579.24545: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867579.24637: Set connection var ansible_pipelining to False 30575 1726867579.24640: Set connection var ansible_shell_type to sh 30575 1726867579.24645: Set connection var ansible_shell_executable to /bin/sh 30575 1726867579.24650: Set connection var ansible_timeout to 10 30575 1726867579.24655: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867579.24662: Set connection var ansible_connection to ssh 30575 1726867579.24684: variable 'ansible_shell_executable' from source: unknown 30575 1726867579.24689: variable 'ansible_connection' from source: unknown 30575 1726867579.24693: variable 'ansible_module_compression' from source: unknown 30575 1726867579.24696: variable 'ansible_shell_type' from source: unknown 30575 1726867579.24698: variable 'ansible_shell_executable' from source: unknown 30575 1726867579.24700: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867579.24703: variable 'ansible_pipelining' from source: unknown 30575 1726867579.24705: variable 'ansible_timeout' from source: unknown 30575 1726867579.24707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867579.24806: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867579.24821: variable 'omit' from source: magic vars 30575 1726867579.24831: starting attempt loop 30575 1726867579.24834: running the handler 30575 1726867579.24861: handler run complete 30575 1726867579.24872: attempt loop complete, returning result 30575 1726867579.24875: _execute() done 30575 1726867579.24879: dumping result to json 30575 1726867579.24882: done dumping result, returning 30575 
1726867579.24889: done running TaskExecutor() for managed_node3/TASK: Success in test 'I can create a profile' [0affcac9-a3a5-e081-a588-000000000098] 30575 1726867579.24896: sending task result for task 0affcac9-a3a5-e081-a588-000000000098 30575 1726867579.24978: done sending task result for task 0affcac9-a3a5-e081-a588-000000000098 30575 1726867579.24981: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: +++++ Success in test 'I can create a profile' +++++ 30575 1726867579.25037: no more pending results, returning what we have 30575 1726867579.25040: results queue empty 30575 1726867579.25041: checking for any_errors_fatal 30575 1726867579.25045: done checking for any_errors_fatal 30575 1726867579.25045: checking for max_fail_percentage 30575 1726867579.25047: done checking for max_fail_percentage 30575 1726867579.25047: checking to see if all hosts have failed and the running result is not ok 30575 1726867579.25048: done checking to see if all hosts have failed 30575 1726867579.25049: getting the remaining hosts for this loop 30575 1726867579.25050: done getting the remaining hosts for this loop 30575 1726867579.25055: getting the next task for host managed_node3 30575 1726867579.25063: done getting next task for host managed_node3 30575 1726867579.25066: ^ task is: TASK: Cleanup 30575 1726867579.25069: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867579.25075: getting variables 30575 1726867579.25076: in VariableManager get_vars() 30575 1726867579.25113: Calling all_inventory to load vars for managed_node3 30575 1726867579.25116: Calling groups_inventory to load vars for managed_node3 30575 1726867579.25119: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867579.25132: Calling all_plugins_play to load vars for managed_node3 30575 1726867579.25134: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867579.25137: Calling groups_plugins_play to load vars for managed_node3 30575 1726867579.29593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867579.30918: done with get_vars() 30575 1726867579.30943: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 17:26:19 -0400 (0:00:00.077) 0:00:14.687 ****** 30575 1726867579.31044: entering _queue_task() for managed_node3/include_tasks 30575 1726867579.31328: worker is 1 (out of 1 available) 30575 1726867579.31340: exiting _queue_task() for managed_node3/include_tasks 30575 1726867579.31353: done queuing things up, now waiting for results queue to drain 30575 1726867579.31355: waiting for pending results... 
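The `Cleanup` task queued here (at `run_test.yml:66`) is an `include_tasks` that loops over `lsr_cleanup` (the trace resolves `item` to `tasks/cleanup_profile+device.yml`). A minimal sketch, assuming the loop variable is passed straight through as the file name:

```yaml
# Sketch of the Cleanup include; the loop source lsr_cleanup is taken
# from the "variable 'lsr_cleanup' from source: include params" lines above.
- name: Cleanup
  ansible.builtin.include_tasks:
    file: "{{ item }}"
  loop: "{{ lsr_cleanup }}"
```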
30575 1726867579.31602: running TaskExecutor() for managed_node3/TASK: Cleanup 30575 1726867579.31686: in run() - task 0affcac9-a3a5-e081-a588-00000000009c 30575 1726867579.31706: variable 'ansible_search_path' from source: unknown 30575 1726867579.31710: variable 'ansible_search_path' from source: unknown 30575 1726867579.31785: variable 'lsr_cleanup' from source: include params 30575 1726867579.32075: variable 'lsr_cleanup' from source: include params 30575 1726867579.32166: variable 'omit' from source: magic vars 30575 1726867579.32393: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867579.32437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867579.32489: variable 'omit' from source: magic vars 30575 1726867579.33062: variable 'ansible_distribution_major_version' from source: facts 30575 1726867579.33066: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867579.33068: variable 'item' from source: unknown 30575 1726867579.33131: variable 'item' from source: unknown 30575 1726867579.33170: variable 'item' from source: unknown 30575 1726867579.33318: variable 'item' from source: unknown 30575 1726867579.33769: dumping result to json 30575 1726867579.33773: done dumping result, returning 30575 1726867579.33775: done running TaskExecutor() for managed_node3/TASK: Cleanup [0affcac9-a3a5-e081-a588-00000000009c] 30575 1726867579.33818: sending task result for task 0affcac9-a3a5-e081-a588-00000000009c 30575 1726867579.33875: done sending task result for task 0affcac9-a3a5-e081-a588-00000000009c 30575 1726867579.33880: WORKER PROCESS EXITING 30575 1726867579.33910: no more pending results, returning what we have 30575 1726867579.33915: in VariableManager get_vars() 30575 1726867579.33960: Calling all_inventory to load vars for managed_node3 30575 1726867579.33963: Calling groups_inventory to load vars for managed_node3 30575 1726867579.33970: Calling 
all_plugins_inventory to load vars for managed_node3 30575 1726867579.33989: Calling all_plugins_play to load vars for managed_node3 30575 1726867579.33993: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867579.33996: Calling groups_plugins_play to load vars for managed_node3 30575 1726867579.35759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867579.38574: done with get_vars() 30575 1726867579.38599: variable 'ansible_search_path' from source: unknown 30575 1726867579.38600: variable 'ansible_search_path' from source: unknown 30575 1726867579.38686: we have included files to process 30575 1726867579.38688: generating all_blocks data 30575 1726867579.38690: done generating all_blocks data 30575 1726867579.38694: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867579.38695: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867579.38698: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867579.39332: done processing included file 30575 1726867579.39338: iterating over new_blocks loaded from include file 30575 1726867579.39339: in VariableManager get_vars() 30575 1726867579.39355: done with get_vars() 30575 1726867579.39356: filtering new block on tags 30575 1726867579.39389: done filtering new block on tags 30575 1726867579.39392: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node3 => (item=tasks/cleanup_profile+device.yml) 30575 1726867579.39397: extending task lists for all hosts with included blocks 
30575 1726867579.41711: done extending task lists 30575 1726867579.41712: done processing included files 30575 1726867579.41712: results queue empty 30575 1726867579.41713: checking for any_errors_fatal 30575 1726867579.41716: done checking for any_errors_fatal 30575 1726867579.41717: checking for max_fail_percentage 30575 1726867579.41717: done checking for max_fail_percentage 30575 1726867579.41718: checking to see if all hosts have failed and the running result is not ok 30575 1726867579.41718: done checking to see if all hosts have failed 30575 1726867579.41719: getting the remaining hosts for this loop 30575 1726867579.41720: done getting the remaining hosts for this loop 30575 1726867579.41721: getting the next task for host managed_node3 30575 1726867579.41725: done getting next task for host managed_node3 30575 1726867579.41727: ^ task is: TASK: Cleanup profile and device 30575 1726867579.41729: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867579.41731: getting variables 30575 1726867579.41731: in VariableManager get_vars() 30575 1726867579.41739: Calling all_inventory to load vars for managed_node3 30575 1726867579.41741: Calling groups_inventory to load vars for managed_node3 30575 1726867579.41742: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867579.41746: Calling all_plugins_play to load vars for managed_node3 30575 1726867579.41748: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867579.41749: Calling groups_plugins_play to load vars for managed_node3 30575 1726867579.42487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867579.44232: done with get_vars() 30575 1726867579.44251: done getting variables 30575 1726867579.44322: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 17:26:19 -0400 (0:00:00.133) 0:00:14.821 ****** 30575 1726867579.44350: entering _queue_task() for managed_node3/shell 30575 1726867579.45153: worker is 1 (out of 1 available) 30575 1726867579.45166: exiting _queue_task() for managed_node3/shell 30575 1726867579.45281: done queuing things up, now waiting for results queue to drain 30575 1726867579.45283: waiting for pending results... 
30575 1726867579.45775: running TaskExecutor() for managed_node3/TASK: Cleanup profile and device 30575 1726867579.45783: in run() - task 0affcac9-a3a5-e081-a588-00000000050b 30575 1726867579.45786: variable 'ansible_search_path' from source: unknown 30575 1726867579.45790: variable 'ansible_search_path' from source: unknown 30575 1726867579.46090: calling self._execute() 30575 1726867579.46093: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867579.46096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867579.46099: variable 'omit' from source: magic vars 30575 1726867579.46537: variable 'ansible_distribution_major_version' from source: facts 30575 1726867579.46556: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867579.46566: variable 'omit' from source: magic vars 30575 1726867579.46616: variable 'omit' from source: magic vars 30575 1726867579.46805: variable 'interface' from source: play vars 30575 1726867579.46833: variable 'omit' from source: magic vars 30575 1726867579.46894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867579.46962: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867579.46982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867579.47006: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867579.47071: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867579.47074: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867579.47087: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867579.47097: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867579.47215: Set connection var ansible_pipelining to False 30575 1726867579.47227: Set connection var ansible_shell_type to sh 30575 1726867579.47239: Set connection var ansible_shell_executable to /bin/sh 30575 1726867579.47289: Set connection var ansible_timeout to 10 30575 1726867579.47292: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867579.47300: Set connection var ansible_connection to ssh 30575 1726867579.47311: variable 'ansible_shell_executable' from source: unknown 30575 1726867579.47319: variable 'ansible_connection' from source: unknown 30575 1726867579.47330: variable 'ansible_module_compression' from source: unknown 30575 1726867579.47338: variable 'ansible_shell_type' from source: unknown 30575 1726867579.47345: variable 'ansible_shell_executable' from source: unknown 30575 1726867579.47396: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867579.47399: variable 'ansible_pipelining' from source: unknown 30575 1726867579.47412: variable 'ansible_timeout' from source: unknown 30575 1726867579.47415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867579.47548: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867579.47563: variable 'omit' from source: magic vars 30575 1726867579.47573: starting attempt loop 30575 1726867579.47581: running the handler 30575 1726867579.47615: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867579.47637: _low_level_execute_command(): starting 30575 1726867579.47650: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867579.48570: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867579.48586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867579.48721: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867579.48958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867579.50679: stdout chunk (state=3): >>>/root <<< 30575 1726867579.50816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867579.50831: stderr chunk (state=3): >>><<< 30575 1726867579.50847: stdout chunk (state=3): >>><<< 30575 1726867579.50888: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867579.50891: _low_level_execute_command(): starting 30575 1726867579.50903: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867579.5086627-31221-219186402455891 `" && echo ansible-tmp-1726867579.5086627-31221-219186402455891="` echo /root/.ansible/tmp/ansible-tmp-1726867579.5086627-31221-219186402455891 `" ) && sleep 0' 30575 1726867579.51464: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867579.51590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867579.51606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 
1726867579.51629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867579.51661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867579.51907: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867579.51922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867579.51942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867579.51984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867579.52061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867579.53990: stdout chunk (state=3): >>>ansible-tmp-1726867579.5086627-31221-219186402455891=/root/.ansible/tmp/ansible-tmp-1726867579.5086627-31221-219186402455891 <<< 30575 1726867579.54136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867579.54140: stdout chunk (state=3): >>><<< 30575 1726867579.54142: stderr chunk (state=3): >>><<< 30575 1726867579.54282: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867579.5086627-31221-219186402455891=/root/.ansible/tmp/ansible-tmp-1726867579.5086627-31221-219186402455891 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867579.54286: variable 'ansible_module_compression' from source: unknown 30575 1726867579.54288: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867579.54302: variable 'ansible_facts' from source: unknown 30575 1726867579.54372: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867579.5086627-31221-219186402455891/AnsiballZ_command.py 30575 1726867579.54531: Sending initial data 30575 1726867579.54540: Sent initial data (156 bytes) 30575 1726867579.55129: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867579.55181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867579.55200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867579.55286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867579.55305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867579.55375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867579.56913: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867579.57183: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867579.57232: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp3_biwelk /root/.ansible/tmp/ansible-tmp-1726867579.5086627-31221-219186402455891/AnsiballZ_command.py <<< 30575 1726867579.57242: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867579.5086627-31221-219186402455891/AnsiballZ_command.py" <<< 30575 1726867579.57275: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp3_biwelk" to remote "/root/.ansible/tmp/ansible-tmp-1726867579.5086627-31221-219186402455891/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867579.5086627-31221-219186402455891/AnsiballZ_command.py" <<< 30575 1726867579.58785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867579.58789: stdout chunk (state=3): >>><<< 30575 1726867579.58791: stderr chunk (state=3): >>><<< 30575 1726867579.58793: done transferring module to remote 30575 1726867579.58795: _low_level_execute_command(): starting 30575 1726867579.58797: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867579.5086627-31221-219186402455891/ /root/.ansible/tmp/ansible-tmp-1726867579.5086627-31221-219186402455891/AnsiballZ_command.py && sleep 0' 30575 1726867579.59885: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867579.59920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867579.60047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867579.60096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867579.61895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867579.61899: stdout chunk (state=3): >>><<< 30575 1726867579.61907: stderr chunk (state=3): >>><<< 30575 1726867579.61925: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867579.61928: _low_level_execute_command(): starting 30575 1726867579.61931: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867579.5086627-31221-219186402455891/AnsiballZ_command.py && sleep 0' 30575 1726867579.63052: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867579.63182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867579.63476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867579.63862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867579.84694: stdout chunk (state=3): >>> {"changed": true, "stdout": 
"Connection 'statebr' (f5796ae9-39ec-4c12-a218-e4d84e010b7f) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 17:26:19.787787", "end": "2024-09-20 17:26:19.843446", "delta": "0:00:00.055659", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867579.86210: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867579.86214: stdout chunk (state=3): >>><<< 30575 1726867579.86220: stderr chunk (state=3): >>><<< 30575 1726867579.86244: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (f5796ae9-39ec-4c12-a218-e4d84e010b7f) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 17:26:19.787787", "end": "2024-09-20 17:26:19.843446", "delta": "0:00:00.055659", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. 30575 1726867579.86282: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867579.5086627-31221-219186402455891/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867579.86384: _low_level_execute_command(): starting 30575 1726867579.86387: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867579.5086627-31221-219186402455891/ > /dev/null 2>&1 && sleep 0' 30575 1726867579.86948: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867579.86954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867579.86975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867579.86990: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867579.87023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867579.87027: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867579.87029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867579.87044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867579.87050: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867579.87090: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867579.87152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867579.87166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867579.87288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867579.89282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867579.89285: stdout chunk (state=3): >>><<< 30575 1726867579.89287: stderr chunk (state=3): >>><<< 30575 1726867579.89290: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867579.89292: handler run complete 30575 1726867579.89294: Evaluated conditional (False): False 30575 1726867579.89297: attempt loop complete, returning result 30575 1726867579.89299: _execute() done 30575 1726867579.89301: dumping result to json 30575 1726867579.89302: done dumping result, returning 30575 1726867579.89311: done running TaskExecutor() for managed_node3/TASK: Cleanup profile and device [0affcac9-a3a5-e081-a588-00000000050b] 30575 1726867579.89316: sending task result for task 0affcac9-a3a5-e081-a588-00000000050b 30575 1726867579.89425: done sending task result for task 0affcac9-a3a5-e081-a588-00000000050b 30575 1726867579.89428: WORKER PROCESS EXITING fatal: [managed_node3]: FAILED! 
=> { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.055659", "end": "2024-09-20 17:26:19.843446", "rc": 1, "start": "2024-09-20 17:26:19.787787" } STDOUT: Connection 'statebr' (f5796ae9-39ec-4c12-a218-e4d84e010b7f) successfully deleted. STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 30575 1726867579.89505: no more pending results, returning what we have 30575 1726867579.89508: results queue empty 30575 1726867579.89640: checking for any_errors_fatal 30575 1726867579.89642: done checking for any_errors_fatal 30575 1726867579.89643: checking for max_fail_percentage 30575 1726867579.89645: done checking for max_fail_percentage 30575 1726867579.89646: checking to see if all hosts have failed and the running result is not ok 30575 1726867579.89647: done checking to see if all hosts have failed 30575 1726867579.89648: getting the remaining hosts for this loop 30575 1726867579.89649: done getting the remaining hosts for this loop 30575 1726867579.89653: getting the next task for host managed_node3 30575 1726867579.89665: done getting next task for host managed_node3 30575 1726867579.89670: ^ task is: TASK: Include the task 'run_test.yml' 30575 1726867579.89672: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867579.89678: getting variables 30575 1726867579.89680: in VariableManager get_vars() 30575 1726867579.89714: Calling all_inventory to load vars for managed_node3 30575 1726867579.89716: Calling groups_inventory to load vars for managed_node3 30575 1726867579.89720: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867579.89735: Calling all_plugins_play to load vars for managed_node3 30575 1726867579.89739: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867579.89821: Calling groups_plugins_play to load vars for managed_node3 30575 1726867579.91845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867579.94047: done with get_vars() 30575 1726867579.94111: done getting variables

TASK [Include the task 'run_test.yml'] *****************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:45
Friday 20 September 2024 17:26:19 -0400 (0:00:00.501) 0:00:15.322 ******

30575 1726867579.94539: entering _queue_task() for managed_node3/include_tasks 30575 1726867579.95520: worker is 1 (out of 1 available) 30575 1726867579.95610: exiting _queue_task() for managed_node3/include_tasks 30575 1726867579.95739: done queuing things up, now waiting for results queue to drain 30575 1726867579.95741: waiting for pending results... 
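For readers reconstructing the test from this trace: the "Cleanup profile and device" task that failed (and was ignored) earlier in the log behaves like a shell task whose combined commands returned rc=1 because the profile file and device were already gone. A minimal sketch of such a task — the command string and the tolerated failure are taken from the logged `cmd` field and the `...ignoring` marker; the module choice and keys are assumptions:

```yaml
# Hypothetical reconstruction of the "Cleanup profile and device" task.
# The four commands match the "cmd" field in the failed result above;
# ignore_errors is inferred from the "...ignoring" marker in the output.
- name: Cleanup profile and device
  shell: |
    nmcli con delete statebr
    nmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr
    rm -f /etc/sysconfig/network-scripts/ifcfg-statebr
    ip link del statebr
  ignore_errors: true
```

With `ignore_errors: true`, the play continues past the non-zero return code, which is why the very next thing the trace does is queue the include of `run_test.yml`.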
30575 1726867579.96078: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' 30575 1726867579.96585: in run() - task 0affcac9-a3a5-e081-a588-00000000000f 30575 1726867579.96590: variable 'ansible_search_path' from source: unknown 30575 1726867579.96592: calling self._execute() 30575 1726867579.96645: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867579.97083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867579.97086: variable 'omit' from source: magic vars 30575 1726867579.97883: variable 'ansible_distribution_major_version' from source: facts 30575 1726867579.97886: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867579.97889: _execute() done 30575 1726867579.97891: dumping result to json 30575 1726867579.97893: done dumping result, returning 30575 1726867579.97895: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [0affcac9-a3a5-e081-a588-00000000000f] 30575 1726867579.97896: sending task result for task 0affcac9-a3a5-e081-a588-00000000000f 30575 1726867579.97963: done sending task result for task 0affcac9-a3a5-e081-a588-00000000000f 30575 1726867579.97967: WORKER PROCESS EXITING 30575 1726867579.98074: no more pending results, returning what we have 30575 1726867579.98083: in VariableManager get_vars() 30575 1726867579.98122: Calling all_inventory to load vars for managed_node3 30575 1726867579.98128: Calling groups_inventory to load vars for managed_node3 30575 1726867579.98133: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867579.98147: Calling all_plugins_play to load vars for managed_node3 30575 1726867579.98151: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867579.98154: Calling groups_plugins_play to load vars for managed_node3 30575 1726867579.99671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30575 1726867580.01926: done with get_vars() 30575 1726867580.01945: variable 'ansible_search_path' from source: unknown 30575 1726867580.01958: we have included files to process 30575 1726867580.01959: generating all_blocks data 30575 1726867580.01961: done generating all_blocks data 30575 1726867580.01966: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867580.01967: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867580.01970: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867580.02521: in VariableManager get_vars() 30575 1726867580.02552: done with get_vars() 30575 1726867580.02981: in VariableManager get_vars() 30575 1726867580.03003: done with get_vars() 30575 1726867580.03045: in VariableManager get_vars() 30575 1726867580.03060: done with get_vars() 30575 1726867580.03101: in VariableManager get_vars() 30575 1726867580.03120: done with get_vars() 30575 1726867580.03159: in VariableManager get_vars() 30575 1726867580.03173: done with get_vars() 30575 1726867580.03569: in VariableManager get_vars() 30575 1726867580.03586: done with get_vars() 30575 1726867580.03598: done processing included file 30575 1726867580.03599: iterating over new_blocks loaded from include file 30575 1726867580.03601: in VariableManager get_vars() 30575 1726867580.03610: done with get_vars() 30575 1726867580.03611: filtering new block on tags 30575 1726867580.03705: done filtering new block on tags 30575 1726867580.03708: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3 30575 1726867580.03713: extending task lists for all hosts with included 
blocks 30575 1726867580.03745: done extending task lists 30575 1726867580.03746: done processing included files 30575 1726867580.03747: results queue empty 30575 1726867580.03748: checking for any_errors_fatal 30575 1726867580.03752: done checking for any_errors_fatal 30575 1726867580.03753: checking for max_fail_percentage 30575 1726867580.03754: done checking for max_fail_percentage 30575 1726867580.03755: checking to see if all hosts have failed and the running result is not ok 30575 1726867580.03756: done checking to see if all hosts have failed 30575 1726867580.03756: getting the remaining hosts for this loop 30575 1726867580.03758: done getting the remaining hosts for this loop 30575 1726867580.03760: getting the next task for host managed_node3 30575 1726867580.03764: done getting next task for host managed_node3 30575 1726867580.03766: ^ task is: TASK: TEST: {{ lsr_description }} 30575 1726867580.03768: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867580.03770: getting variables 30575 1726867580.03771: in VariableManager get_vars() 30575 1726867580.03781: Calling all_inventory to load vars for managed_node3 30575 1726867580.03783: Calling groups_inventory to load vars for managed_node3 30575 1726867580.03785: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867580.03790: Calling all_plugins_play to load vars for managed_node3 30575 1726867580.03793: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867580.03796: Calling groups_plugins_play to load vars for managed_node3 30575 1726867580.04955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867580.06482: done with get_vars() 30575 1726867580.06506: done getting variables 30575 1726867580.06551: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867580.06674: variable 'lsr_description' from source: include params

TASK [TEST: I can create a profile without autoconnect] ************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5
Friday 20 September 2024 17:26:20 -0400 (0:00:00.122) 0:00:15.444 ******

30575 1726867580.06706: entering _queue_task() for managed_node3/debug 30575 1726867580.07031: worker is 1 (out of 1 available) 30575 1726867580.07042: exiting _queue_task() for managed_node3/debug 30575 1726867580.07053: done queuing things up, now waiting for results queue to drain 30575 1726867580.07055: waiting for pending results... 
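The include processed earlier in the trace (task path `tests_states.yml:45`, conditional `ansible_distribution_major_version != '6'` evaluated to True) corresponds to an `include_tasks` task. A plausible shape for it — the task name, file, and conditional are taken from the log; the exact keys are an assumption:

```yaml
# Sketch only: reconstructed from the logged task name, include path,
# and evaluated conditional; the real tests_states.yml may differ.
- name: Include the task 'run_test.yml'
  include_tasks: tasks/run_test.yml
  when: ansible_distribution_major_version != '6'
```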
30575 1726867580.07322: running TaskExecutor() for managed_node3/TASK: TEST: I can create a profile without autoconnect 30575 1726867580.07439: in run() - task 0affcac9-a3a5-e081-a588-0000000005b4 30575 1726867580.07460: variable 'ansible_search_path' from source: unknown 30575 1726867580.07468: variable 'ansible_search_path' from source: unknown 30575 1726867580.07514: calling self._execute() 30575 1726867580.07614: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.07625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.07638: variable 'omit' from source: magic vars 30575 1726867580.07996: variable 'ansible_distribution_major_version' from source: facts 30575 1726867580.08013: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867580.08025: variable 'omit' from source: magic vars 30575 1726867580.08070: variable 'omit' from source: magic vars 30575 1726867580.08194: variable 'lsr_description' from source: include params 30575 1726867580.08223: variable 'omit' from source: magic vars 30575 1726867580.08284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867580.08333: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867580.08365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867580.08397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.08419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.08456: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867580.08466: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867580.08479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.08600: Set connection var ansible_pipelining to False 30575 1726867580.08611: Set connection var ansible_shell_type to sh 30575 1726867580.08623: Set connection var ansible_shell_executable to /bin/sh 30575 1726867580.08641: Set connection var ansible_timeout to 10 30575 1726867580.08654: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867580.08667: Set connection var ansible_connection to ssh 30575 1726867580.08700: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.08707: variable 'ansible_connection' from source: unknown 30575 1726867580.08715: variable 'ansible_module_compression' from source: unknown 30575 1726867580.08724: variable 'ansible_shell_type' from source: unknown 30575 1726867580.08734: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.08742: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.08751: variable 'ansible_pipelining' from source: unknown 30575 1726867580.08758: variable 'ansible_timeout' from source: unknown 30575 1726867580.08770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.08921: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867580.09020: variable 'omit' from source: magic vars 30575 1726867580.09024: starting attempt loop 30575 1726867580.09026: running the handler 30575 1726867580.09028: handler run complete 30575 1726867580.09030: attempt loop complete, returning result 30575 1726867580.09032: _execute() done 30575 1726867580.09035: dumping result to json 30575 1726867580.09041: done dumping result, returning 
30575 1726867580.09053: done running TaskExecutor() for managed_node3/TASK: TEST: I can create a profile without autoconnect [0affcac9-a3a5-e081-a588-0000000005b4] 30575 1726867580.09062: sending task result for task 0affcac9-a3a5-e081-a588-0000000005b4 30575 1726867580.09386: done sending task result for task 0affcac9-a3a5-e081-a588-0000000005b4 30575 1726867580.09391: WORKER PROCESS EXITING
ok: [managed_node3] => {}

MSG:

########## I can create a profile without autoconnect ##########
30575 1726867580.09447: no more pending results, returning what we have 30575 1726867580.09451: results queue empty 30575 1726867580.09453: checking for any_errors_fatal 30575 1726867580.09456: done checking for any_errors_fatal 30575 1726867580.09456: checking for max_fail_percentage 30575 1726867580.09458: done checking for max_fail_percentage 30575 1726867580.09459: checking to see if all hosts have failed and the running result is not ok 30575 1726867580.09460: done checking to see if all hosts have failed 30575 1726867580.09461: getting the remaining hosts for this loop 30575 1726867580.09462: done getting the remaining hosts for this loop 30575 1726867580.09467: getting the next task for host managed_node3 30575 1726867580.09473: done getting next task for host managed_node3 30575 1726867580.09478: ^ task is: TASK: Show item 30575 1726867580.09483: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 30575 1726867580.09488: getting variables 30575 1726867580.09490: in VariableManager get_vars() 30575 1726867580.09521: Calling all_inventory to load vars for managed_node3 30575 1726867580.09524: Calling groups_inventory to load vars for managed_node3 30575 1726867580.09528: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867580.09538: Calling all_plugins_play to load vars for managed_node3 30575 1726867580.09541: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867580.09546: Calling groups_plugins_play to load vars for managed_node3 30575 1726867580.11168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867580.13445: done with get_vars() 30575 1726867580.13465: done getting variables 30575 1726867580.13521: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Show item] ***************************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9
Friday 20 September 2024 17:26:20 -0400 (0:00:00.068) 0:00:15.513 ******

30575 1726867580.13568: entering _queue_task() for managed_node3/debug 30575 1726867580.13912: worker is 1 (out of 1 available) 30575 1726867580.13925: exiting _queue_task() for managed_node3/debug 30575 1726867580.13938: done queuing things up, now waiting for results queue to drain 30575 1726867580.13939: waiting for pending results... 
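The "##########"-framed banner printed above comes from a debug task whose name is templated from the test parameter. A minimal sketch — the templated name matches the rendered `TASK [TEST: I can create a profile without autoconnect]` banner and the logged MSG; the exact `msg` formatting is an assumption:

```yaml
# Hypothetical sketch of the banner task at run_test.yml:5.
# lsr_description is supplied as an include parameter (see the
# "variable 'lsr_description' from source: include params" entries).
- name: "TEST: {{ lsr_description }}"
  debug:
    msg: "########## {{ lsr_description }} ##########"
```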
30575 1726867580.14214: running TaskExecutor() for managed_node3/TASK: Show item 30575 1726867580.14307: in run() - task 0affcac9-a3a5-e081-a588-0000000005b5 30575 1726867580.14325: variable 'ansible_search_path' from source: unknown 30575 1726867580.14333: variable 'ansible_search_path' from source: unknown 30575 1726867580.14390: variable 'omit' from source: magic vars 30575 1726867580.14532: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.14545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.14558: variable 'omit' from source: magic vars 30575 1726867580.14900: variable 'ansible_distribution_major_version' from source: facts 30575 1726867580.14918: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867580.14931: variable 'omit' from source: magic vars 30575 1726867580.14975: variable 'omit' from source: magic vars 30575 1726867580.15023: variable 'item' from source: unknown 30575 1726867580.15103: variable 'item' from source: unknown 30575 1726867580.15125: variable 'omit' from source: magic vars 30575 1726867580.15172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867580.15266: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867580.15270: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867580.15273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.15275: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.15308: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867580.15318: variable 'ansible_host' from source: host vars for 'managed_node3' 
30575 1726867580.15325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.15429: Set connection var ansible_pipelining to False 30575 1726867580.15438: Set connection var ansible_shell_type to sh 30575 1726867580.15482: Set connection var ansible_shell_executable to /bin/sh 30575 1726867580.15486: Set connection var ansible_timeout to 10 30575 1726867580.15488: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867580.15490: Set connection var ansible_connection to ssh 30575 1726867580.15508: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.15516: variable 'ansible_connection' from source: unknown 30575 1726867580.15523: variable 'ansible_module_compression' from source: unknown 30575 1726867580.15530: variable 'ansible_shell_type' from source: unknown 30575 1726867580.15537: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.15682: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.15685: variable 'ansible_pipelining' from source: unknown 30575 1726867580.15687: variable 'ansible_timeout' from source: unknown 30575 1726867580.15690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.15696: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867580.15715: variable 'omit' from source: magic vars 30575 1726867580.15726: starting attempt loop 30575 1726867580.15733: running the handler 30575 1726867580.15783: variable 'lsr_description' from source: include params 30575 1726867580.15852: variable 'lsr_description' from source: include params 30575 1726867580.15867: handler run complete 30575 1726867580.15890: attempt loop 
complete, returning result 30575 1726867580.15914: variable 'item' from source: unknown 30575 1726867580.15979: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_description) => {
    "ansible_loop_var": "item",
    "item": "lsr_description",
    "lsr_description": "I can create a profile without autoconnect"
}
30575 1726867580.16383: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.16386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.16388: variable 'omit' from source: magic vars 30575 1726867580.16451: variable 'ansible_distribution_major_version' from source: facts 30575 1726867580.16454: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867580.16456: variable 'omit' from source: magic vars 30575 1726867580.16653: variable 'omit' from source: magic vars 30575 1726867580.16657: variable 'item' from source: unknown 30575 1726867580.16886: variable 'item' from source: unknown 30575 1726867580.16907: variable 'omit' from source: magic vars 30575 1726867580.16931: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867580.16944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.16955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.16971: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867580.16981: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.16989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.17063: Set connection var ansible_pipelining to False 30575 1726867580.17072: Set connection var ansible_shell_type to sh 
30575 1726867580.17084: Set connection var ansible_shell_executable to /bin/sh 30575 1726867580.17093: Set connection var ansible_timeout to 10 30575 1726867580.17100: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867580.17183: Set connection var ansible_connection to ssh 30575 1726867580.17186: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.17189: variable 'ansible_connection' from source: unknown 30575 1726867580.17192: variable 'ansible_module_compression' from source: unknown 30575 1726867580.17194: variable 'ansible_shell_type' from source: unknown 30575 1726867580.17196: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.17198: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.17200: variable 'ansible_pipelining' from source: unknown 30575 1726867580.17202: variable 'ansible_timeout' from source: unknown 30575 1726867580.17204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.17260: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867580.17275: variable 'omit' from source: magic vars 30575 1726867580.17286: starting attempt loop 30575 1726867580.17293: running the handler 30575 1726867580.17316: variable 'lsr_setup' from source: include params 30575 1726867580.17386: variable 'lsr_setup' from source: include params 30575 1726867580.17432: handler run complete 30575 1726867580.17448: attempt loop complete, returning result 30575 1726867580.17464: variable 'item' from source: unknown 30575 1726867580.17526: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_setup) => {
    "ansible_loop_var": "item",
    "item": "lsr_setup",
    "lsr_setup": [
        "tasks/delete_interface.yml",
        "tasks/assert_device_absent.yml"
    ]
}
30575 1726867580.17709: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.17712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.17714: variable 'omit' from source: magic vars 30575 1726867580.17882: variable 'ansible_distribution_major_version' from source: facts 30575 1726867580.17886: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867580.17888: variable 'omit' from source: magic vars 30575 1726867580.17890: variable 'omit' from source: magic vars 30575 1726867580.17937: variable 'item' from source: unknown 30575 1726867580.17981: variable 'item' from source: unknown 30575 1726867580.18000: variable 'omit' from source: magic vars 30575 1726867580.18045: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867580.18049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.18051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.18054: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867580.18061: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.18067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.18154: Set connection var ansible_pipelining to False 30575 1726867580.18157: Set connection var ansible_shell_type to sh 30575 1726867580.18160: Set connection var ansible_shell_executable to /bin/sh 30575 1726867580.18162: Set connection var ansible_timeout to 10 30575 1726867580.18172: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867580.18263: Set 
connection var ansible_connection to ssh 30575 1726867580.18266: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.18268: variable 'ansible_connection' from source: unknown 30575 1726867580.18271: variable 'ansible_module_compression' from source: unknown 30575 1726867580.18273: variable 'ansible_shell_type' from source: unknown 30575 1726867580.18275: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.18279: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.18281: variable 'ansible_pipelining' from source: unknown 30575 1726867580.18283: variable 'ansible_timeout' from source: unknown 30575 1726867580.18285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.18334: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867580.18346: variable 'omit' from source: magic vars 30575 1726867580.18355: starting attempt loop 30575 1726867580.18362: running the handler 30575 1726867580.18390: variable 'lsr_test' from source: include params 30575 1726867580.18451: variable 'lsr_test' from source: include params 30575 1726867580.18472: handler run complete 30575 1726867580.18497: attempt loop complete, returning result 30575 1726867580.18515: variable 'item' from source: unknown 30575 1726867580.18573: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_test) => {
    "ansible_loop_var": "item",
    "item": "lsr_test",
    "lsr_test": [
        "tasks/create_bridge_profile_no_autoconnect.yml"
    ]
}
30575 1726867580.18742: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.18745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867580.18748: variable 'omit' from source: magic vars 30575 1726867580.18972: variable 'ansible_distribution_major_version' from source: facts 30575 1726867580.18975: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867580.18980: variable 'omit' from source: magic vars 30575 1726867580.18983: variable 'omit' from source: magic vars 30575 1726867580.18985: variable 'item' from source: unknown 30575 1726867580.19014: variable 'item' from source: unknown 30575 1726867580.19033: variable 'omit' from source: magic vars 30575 1726867580.19053: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867580.19065: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.19081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.19097: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867580.19105: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.19112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.19279: Set connection var ansible_pipelining to False 30575 1726867580.19284: Set connection var ansible_shell_type to sh 30575 1726867580.19286: Set connection var ansible_shell_executable to /bin/sh 30575 1726867580.19288: Set connection var ansible_timeout to 10 30575 1726867580.19290: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867580.19291: Set connection var ansible_connection to ssh 30575 1726867580.19293: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.19294: variable 'ansible_connection' from source: unknown 30575 1726867580.19296: variable 'ansible_module_compression' from 
source: unknown 30575 1726867580.19298: variable 'ansible_shell_type' from source: unknown 30575 1726867580.19299: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.19301: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.19302: variable 'ansible_pipelining' from source: unknown 30575 1726867580.19304: variable 'ansible_timeout' from source: unknown 30575 1726867580.19305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.19360: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867580.19684: variable 'omit' from source: magic vars 30575 1726867580.19687: starting attempt loop 30575 1726867580.19690: running the handler 30575 1726867580.19692: variable 'lsr_assert' from source: include params 30575 1726867580.19694: variable 'lsr_assert' from source: include params 30575 1726867580.19696: handler run complete 30575 1726867580.19698: attempt loop complete, returning result 30575 1726867580.19711: variable 'item' from source: unknown 30575 1726867580.19768: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_assert) => {
    "ansible_loop_var": "item",
    "item": "lsr_assert",
    "lsr_assert": [
        "tasks/assert_device_absent.yml",
        "tasks/assert_profile_present.yml"
    ]
}
30575 1726867580.19897: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.20082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.20085: variable 'omit' from source: magic vars 30575 1726867580.20385: variable 'ansible_distribution_major_version' from source: facts 30575 1726867580.20388: Evaluated conditional (ansible_distribution_major_version != '6'): True 
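The per-item results above (item=lsr_description, lsr_setup, lsr_test, lsr_assert) are consistent with a single looped debug task where each item is the *name* of a test parameter to display, which is why the result key matches the item string. A sketch of such a task — the loop items are the ones visible in this excerpt (the real loop may include more); the keys are assumptions:

```yaml
# Hypothetical sketch of the "Show item" task at run_test.yml:9.
# "var: {{ item }}" renders the variable named by each loop item,
# matching the (item=lsr_*) result shapes printed above.
- name: Show item
  debug:
    var: "{{ item }}"
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
```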
30575 1726867580.20395: variable 'omit' from source: magic vars 30575 1726867580.20397: variable 'omit' from source: magic vars 30575 1726867580.20528: variable 'item' from source: unknown 30575 1726867580.20591: variable 'item' from source: unknown 30575 1726867580.20697: variable 'omit' from source: magic vars 30575 1726867580.20722: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867580.20733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.20743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.20830: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867580.20838: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.20845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.20914: Set connection var ansible_pipelining to False 30575 1726867580.21143: Set connection var ansible_shell_type to sh 30575 1726867580.21146: Set connection var ansible_shell_executable to /bin/sh 30575 1726867580.21148: Set connection var ansible_timeout to 10 30575 1726867580.21150: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867580.21152: Set connection var ansible_connection to ssh 30575 1726867580.21154: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.21156: variable 'ansible_connection' from source: unknown 30575 1726867580.21158: variable 'ansible_module_compression' from source: unknown 30575 1726867580.21160: variable 'ansible_shell_type' from source: unknown 30575 1726867580.21162: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.21163: variable 'ansible_host' from source: host vars 
for 'managed_node3' 30575 1726867580.21165: variable 'ansible_pipelining' from source: unknown 30575 1726867580.21167: variable 'ansible_timeout' from source: unknown 30575 1726867580.21169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.21297: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867580.21309: variable 'omit' from source: magic vars 30575 1726867580.21316: starting attempt loop 30575 1726867580.21324: running the handler 30575 1726867580.21580: handler run complete 30575 1726867580.21597: attempt loop complete, returning result 30575 1726867580.21614: variable 'item' from source: unknown 30575 1726867580.21674: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 30575 1726867580.21921: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.21933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.22115: variable 'omit' from source: magic vars 30575 1726867580.22268: variable 'ansible_distribution_major_version' from source: facts 30575 1726867580.22280: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867580.22288: variable 'omit' from source: magic vars 30575 1726867580.22305: variable 'omit' from source: magic vars 30575 1726867580.22371: variable 'item' from source: unknown 30575 1726867580.22502: variable 'item' from source: unknown 30575 1726867580.22565: variable 'omit' from source: magic vars 30575 1726867580.22802: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867580.22910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.22913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.22916: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867580.22918: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.22920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.22921: Set connection var ansible_pipelining to False 30575 1726867580.22923: Set connection var ansible_shell_type to sh 30575 1726867580.22925: Set connection var ansible_shell_executable to /bin/sh 30575 1726867580.22934: Set connection var ansible_timeout to 10 30575 1726867580.22942: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867580.22952: Set connection var ansible_connection to ssh 30575 1726867580.22975: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.22986: variable 'ansible_connection' from source: unknown 30575 1726867580.22994: variable 'ansible_module_compression' from source: unknown 30575 1726867580.23000: variable 'ansible_shell_type' from source: unknown 30575 1726867580.23006: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.23014: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.23023: variable 'ansible_pipelining' from source: unknown 30575 1726867580.23029: variable 'ansible_timeout' from source: unknown 30575 1726867580.23036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.23118: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867580.23135: variable 'omit' from source: magic vars 30575 1726867580.23143: starting attempt loop 30575 1726867580.23151: running the handler 30575 1726867580.23171: variable 'lsr_fail_debug' from source: play vars 30575 1726867580.23238: variable 'lsr_fail_debug' from source: play vars 30575 1726867580.23258: handler run complete 30575 1726867580.23275: attempt loop complete, returning result 30575 1726867580.23296: variable 'item' from source: unknown 30575 1726867580.23357: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30575 1726867580.23564: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.23568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.23570: variable 'omit' from source: magic vars 30575 1726867580.23659: variable 'ansible_distribution_major_version' from source: facts 30575 1726867580.23676: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867580.23784: variable 'omit' from source: magic vars 30575 1726867580.23788: variable 'omit' from source: magic vars 30575 1726867580.23790: variable 'item' from source: unknown 30575 1726867580.23805: variable 'item' from source: unknown 30575 1726867580.23824: variable 'omit' from source: magic vars 30575 1726867580.23848: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867580.23859: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.23870: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.23894: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867580.23902: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.23910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.23974: Set connection var ansible_pipelining to False 30575 1726867580.23983: Set connection var ansible_shell_type to sh 30575 1726867580.23993: Set connection var ansible_shell_executable to /bin/sh 30575 1726867580.24008: Set connection var ansible_timeout to 10 30575 1726867580.24018: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867580.24029: Set connection var ansible_connection to ssh 30575 1726867580.24052: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.24060: variable 'ansible_connection' from source: unknown 30575 1726867580.24066: variable 'ansible_module_compression' from source: unknown 30575 1726867580.24073: variable 'ansible_shell_type' from source: unknown 30575 1726867580.24081: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.24109: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.24112: variable 'ansible_pipelining' from source: unknown 30575 1726867580.24114: variable 'ansible_timeout' from source: unknown 30575 1726867580.24116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.24187: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867580.24199: variable 'omit' from source: magic vars 30575 1726867580.24217: starting attempt loop 30575 1726867580.24220: running the handler 30575 1726867580.24283: variable 'lsr_cleanup' from source: include params 30575 1726867580.24293: variable 'lsr_cleanup' from source: include params 30575 1726867580.24312: handler run complete 30575 1726867580.24334: attempt loop complete, returning result 30575 1726867580.24352: variable 'item' from source: unknown 30575 1726867580.24410: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30575 1726867580.24595: dumping result to json 30575 1726867580.24598: done dumping result, returning 30575 1726867580.24601: done running TaskExecutor() for managed_node3/TASK: Show item [0affcac9-a3a5-e081-a588-0000000005b5] 30575 1726867580.24603: sending task result for task 0affcac9-a3a5-e081-a588-0000000005b5 30575 1726867580.24649: done sending task result for task 0affcac9-a3a5-e081-a588-0000000005b5 30575 1726867580.24652: WORKER PROCESS EXITING 30575 1726867580.24750: no more pending results, returning what we have 30575 1726867580.24754: results queue empty 30575 1726867580.24755: checking for any_errors_fatal 30575 1726867580.24762: done checking for any_errors_fatal 30575 1726867580.24763: checking for max_fail_percentage 30575 1726867580.24765: done checking for max_fail_percentage 30575 1726867580.24766: checking to see if all hosts have failed and the running result is not ok 30575 1726867580.24767: done checking to see if all hosts have failed 30575 1726867580.24768: getting the remaining hosts for this loop 30575 1726867580.24770: done getting the remaining hosts for this loop 30575 
1726867580.24774: getting the next task for host managed_node3 30575 1726867580.24784: done getting next task for host managed_node3 30575 1726867580.24787: ^ task is: TASK: Include the task 'show_interfaces.yml' 30575 1726867580.24790: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867580.24794: getting variables 30575 1726867580.24796: in VariableManager get_vars() 30575 1726867580.24829: Calling all_inventory to load vars for managed_node3 30575 1726867580.24831: Calling groups_inventory to load vars for managed_node3 30575 1726867580.24835: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867580.24846: Calling all_plugins_play to load vars for managed_node3 30575 1726867580.24849: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867580.24853: Calling groups_plugins_play to load vars for managed_node3 30575 1726867580.27882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867580.30039: done with get_vars() 30575 1726867580.30063: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 17:26:20 -0400 (0:00:00.165) 0:00:15.679 ****** 30575 1726867580.30154: entering _queue_task() for managed_node3/include_tasks 30575 
1726867580.30683: worker is 1 (out of 1 available) 30575 1726867580.30694: exiting _queue_task() for managed_node3/include_tasks 30575 1726867580.30707: done queuing things up, now waiting for results queue to drain 30575 1726867580.30708: waiting for pending results... 30575 1726867580.31595: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 30575 1726867580.31603: in run() - task 0affcac9-a3a5-e081-a588-0000000005b6 30575 1726867580.31607: variable 'ansible_search_path' from source: unknown 30575 1726867580.31609: variable 'ansible_search_path' from source: unknown 30575 1726867580.31612: calling self._execute() 30575 1726867580.31674: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.31944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.31947: variable 'omit' from source: magic vars 30575 1726867580.32538: variable 'ansible_distribution_major_version' from source: facts 30575 1726867580.32555: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867580.32600: _execute() done 30575 1726867580.32611: dumping result to json 30575 1726867580.32620: done dumping result, returning 30575 1726867580.32700: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affcac9-a3a5-e081-a588-0000000005b6] 30575 1726867580.32706: sending task result for task 0affcac9-a3a5-e081-a588-0000000005b6 30575 1726867580.32910: no more pending results, returning what we have 30575 1726867580.32916: in VariableManager get_vars() 30575 1726867580.32956: Calling all_inventory to load vars for managed_node3 30575 1726867580.32959: Calling groups_inventory to load vars for managed_node3 30575 1726867580.32963: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867580.32979: Calling all_plugins_play to load vars for managed_node3 30575 1726867580.32983: Calling groups_plugins_inventory to load 
vars for managed_node3 30575 1726867580.32987: Calling groups_plugins_play to load vars for managed_node3 30575 1726867580.33883: done sending task result for task 0affcac9-a3a5-e081-a588-0000000005b6 30575 1726867580.33886: WORKER PROCESS EXITING 30575 1726867580.36283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867580.39256: done with get_vars() 30575 1726867580.39286: variable 'ansible_search_path' from source: unknown 30575 1726867580.39287: variable 'ansible_search_path' from source: unknown 30575 1726867580.39324: we have included files to process 30575 1726867580.39325: generating all_blocks data 30575 1726867580.39327: done generating all_blocks data 30575 1726867580.39330: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30575 1726867580.39331: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30575 1726867580.39333: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30575 1726867580.39434: in VariableManager get_vars() 30575 1726867580.39453: done with get_vars() 30575 1726867580.39558: done processing included file 30575 1726867580.39560: iterating over new_blocks loaded from include file 30575 1726867580.39561: in VariableManager get_vars() 30575 1726867580.39573: done with get_vars() 30575 1726867580.39574: filtering new block on tags 30575 1726867580.39604: done filtering new block on tags 30575 1726867580.39607: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 30575 1726867580.39612: extending task lists for all hosts with included blocks 30575 1726867580.40027: 
done extending task lists 30575 1726867580.40028: done processing included files 30575 1726867580.40029: results queue empty 30575 1726867580.40030: checking for any_errors_fatal 30575 1726867580.40036: done checking for any_errors_fatal 30575 1726867580.40037: checking for max_fail_percentage 30575 1726867580.40038: done checking for max_fail_percentage 30575 1726867580.40038: checking to see if all hosts have failed and the running result is not ok 30575 1726867580.40039: done checking to see if all hosts have failed 30575 1726867580.40040: getting the remaining hosts for this loop 30575 1726867580.40041: done getting the remaining hosts for this loop 30575 1726867580.40044: getting the next task for host managed_node3 30575 1726867580.40048: done getting next task for host managed_node3 30575 1726867580.40050: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30575 1726867580.40053: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867580.40056: getting variables 30575 1726867580.40057: in VariableManager get_vars() 30575 1726867580.40066: Calling all_inventory to load vars for managed_node3 30575 1726867580.40068: Calling groups_inventory to load vars for managed_node3 30575 1726867580.40071: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867580.40076: Calling all_plugins_play to load vars for managed_node3 30575 1726867580.40081: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867580.40084: Calling groups_plugins_play to load vars for managed_node3 30575 1726867580.41196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867580.42679: done with get_vars() 30575 1726867580.42698: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 17:26:20 -0400 (0:00:00.126) 0:00:15.805 ****** 30575 1726867580.42766: entering _queue_task() for managed_node3/include_tasks 30575 1726867580.43102: worker is 1 (out of 1 available) 30575 1726867580.43114: exiting _queue_task() for managed_node3/include_tasks 30575 1726867580.43129: done queuing things up, now waiting for results queue to drain 30575 1726867580.43131: waiting for pending results... 
30575 1726867580.43409: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 30575 1726867580.43524: in run() - task 0affcac9-a3a5-e081-a588-0000000005dd 30575 1726867580.43546: variable 'ansible_search_path' from source: unknown 30575 1726867580.43553: variable 'ansible_search_path' from source: unknown 30575 1726867580.43603: calling self._execute() 30575 1726867580.43690: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.43703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.43783: variable 'omit' from source: magic vars 30575 1726867580.44076: variable 'ansible_distribution_major_version' from source: facts 30575 1726867580.44095: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867580.44105: _execute() done 30575 1726867580.44113: dumping result to json 30575 1726867580.44120: done dumping result, returning 30575 1726867580.44131: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affcac9-a3a5-e081-a588-0000000005dd] 30575 1726867580.44145: sending task result for task 0affcac9-a3a5-e081-a588-0000000005dd 30575 1726867580.44501: no more pending results, returning what we have 30575 1726867580.44505: in VariableManager get_vars() 30575 1726867580.44536: Calling all_inventory to load vars for managed_node3 30575 1726867580.44538: Calling groups_inventory to load vars for managed_node3 30575 1726867580.44541: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867580.44551: Calling all_plugins_play to load vars for managed_node3 30575 1726867580.44554: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867580.44557: Calling groups_plugins_play to load vars for managed_node3 30575 1726867580.45190: done sending task result for task 0affcac9-a3a5-e081-a588-0000000005dd 30575 1726867580.45193: WORKER PROCESS EXITING 30575 
1726867580.45967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867580.47427: done with get_vars() 30575 1726867580.47445: variable 'ansible_search_path' from source: unknown 30575 1726867580.47446: variable 'ansible_search_path' from source: unknown 30575 1726867580.47480: we have included files to process 30575 1726867580.47481: generating all_blocks data 30575 1726867580.47483: done generating all_blocks data 30575 1726867580.47484: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867580.47485: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867580.47487: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867580.47710: done processing included file 30575 1726867580.47712: iterating over new_blocks loaded from include file 30575 1726867580.47714: in VariableManager get_vars() 30575 1726867580.47728: done with get_vars() 30575 1726867580.47730: filtering new block on tags 30575 1726867580.47764: done filtering new block on tags 30575 1726867580.47766: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 30575 1726867580.47771: extending task lists for all hosts with included blocks 30575 1726867580.48127: done extending task lists 30575 1726867580.48128: done processing included files 30575 1726867580.48129: results queue empty 30575 1726867580.48129: checking for any_errors_fatal 30575 1726867580.48132: done checking for any_errors_fatal 30575 1726867580.48133: checking for max_fail_percentage 30575 1726867580.48134: done 
checking for max_fail_percentage 30575 1726867580.48135: checking to see if all hosts have failed and the running result is not ok 30575 1726867580.48136: done checking to see if all hosts have failed 30575 1726867580.48136: getting the remaining hosts for this loop 30575 1726867580.48137: done getting the remaining hosts for this loop 30575 1726867580.48140: getting the next task for host managed_node3 30575 1726867580.48144: done getting next task for host managed_node3 30575 1726867580.48146: ^ task is: TASK: Gather current interface info 30575 1726867580.48150: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867580.48152: getting variables 30575 1726867580.48153: in VariableManager get_vars() 30575 1726867580.48162: Calling all_inventory to load vars for managed_node3 30575 1726867580.48165: Calling groups_inventory to load vars for managed_node3 30575 1726867580.48167: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867580.48172: Calling all_plugins_play to load vars for managed_node3 30575 1726867580.48174: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867580.48381: Calling groups_plugins_play to load vars for managed_node3 30575 1726867580.50531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867580.52109: done with get_vars() 30575 1726867580.52137: done getting variables 30575 1726867580.52187: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 17:26:20 -0400 (0:00:00.094) 0:00:15.899 ****** 30575 1726867580.52220: entering _queue_task() for managed_node3/command 30575 1726867580.52811: worker is 1 (out of 1 available) 30575 1726867580.52820: exiting _queue_task() for managed_node3/command 30575 1726867580.52831: done queuing things up, now waiting for results queue to drain 30575 1726867580.52833: waiting for pending results... 
30575 1726867580.52900: running TaskExecutor() for managed_node3/TASK: Gather current interface info 30575 1726867580.53040: in run() - task 0affcac9-a3a5-e081-a588-000000000618 30575 1726867580.53065: variable 'ansible_search_path' from source: unknown 30575 1726867580.53073: variable 'ansible_search_path' from source: unknown 30575 1726867580.53121: calling self._execute() 30575 1726867580.53222: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.53233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.53248: variable 'omit' from source: magic vars 30575 1726867580.53624: variable 'ansible_distribution_major_version' from source: facts 30575 1726867580.53641: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867580.53651: variable 'omit' from source: magic vars 30575 1726867580.53711: variable 'omit' from source: magic vars 30575 1726867580.53753: variable 'omit' from source: magic vars 30575 1726867580.53798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867580.53846: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867580.53872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867580.53898: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.53916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867580.53955: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867580.53964: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.53971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867580.54079: Set connection var ansible_pipelining to False 30575 1726867580.54089: Set connection var ansible_shell_type to sh 30575 1726867580.54101: Set connection var ansible_shell_executable to /bin/sh 30575 1726867580.54111: Set connection var ansible_timeout to 10 30575 1726867580.54120: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867580.54131: Set connection var ansible_connection to ssh 30575 1726867580.54161: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.54169: variable 'ansible_connection' from source: unknown 30575 1726867580.54176: variable 'ansible_module_compression' from source: unknown 30575 1726867580.54184: variable 'ansible_shell_type' from source: unknown 30575 1726867580.54190: variable 'ansible_shell_executable' from source: unknown 30575 1726867580.54196: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.54202: variable 'ansible_pipelining' from source: unknown 30575 1726867580.54208: variable 'ansible_timeout' from source: unknown 30575 1726867580.54215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.54364: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867580.54383: variable 'omit' from source: magic vars 30575 1726867580.54393: starting attempt loop 30575 1726867580.54399: running the handler 30575 1726867580.54416: _low_level_execute_command(): starting 30575 1726867580.54428: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867580.55157: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867580.55174: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30575 1726867580.55192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867580.55211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867580.55315: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867580.55787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867580.55891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867580.57562: stdout chunk (state=3): >>>/root <<< 30575 1726867580.57696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867580.57744: stderr chunk (state=3): >>><<< 30575 1726867580.57758: stdout chunk (state=3): >>><<< 30575 1726867580.57947: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867580.57950: _low_level_execute_command(): starting 30575 1726867580.57954: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867580.5784874-31263-132799264264485 `" && echo ansible-tmp-1726867580.5784874-31263-132799264264485="` echo /root/.ansible/tmp/ansible-tmp-1726867580.5784874-31263-132799264264485 `" ) && sleep 0' 30575 1726867580.59597: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867580.59609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867580.59625: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867580.59723: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867580.59883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867580.59956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867580.61826: stdout chunk (state=3): >>>ansible-tmp-1726867580.5784874-31263-132799264264485=/root/.ansible/tmp/ansible-tmp-1726867580.5784874-31263-132799264264485 <<< 30575 1726867580.62001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867580.62005: stdout chunk (state=3): >>><<< 30575 1726867580.62012: stderr chunk (state=3): >>><<< 30575 1726867580.62041: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867580.5784874-31263-132799264264485=/root/.ansible/tmp/ansible-tmp-1726867580.5784874-31263-132799264264485 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867580.62072: variable 'ansible_module_compression' from source: unknown 30575 1726867580.62125: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867580.62169: variable 'ansible_facts' from source: unknown 30575 1726867580.62458: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867580.5784874-31263-132799264264485/AnsiballZ_command.py 30575 1726867580.62726: Sending initial data 30575 1726867580.62733: Sent initial data (156 bytes) 30575 1726867580.64132: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867580.64217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867580.64326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867580.64330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867580.64473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867580.66007: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867580.66050: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867580.66104: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpq3hao5mu /root/.ansible/tmp/ansible-tmp-1726867580.5784874-31263-132799264264485/AnsiballZ_command.py <<< 30575 1726867580.66115: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867580.5784874-31263-132799264264485/AnsiballZ_command.py" <<< 30575 1726867580.66157: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpq3hao5mu" to remote "/root/.ansible/tmp/ansible-tmp-1726867580.5784874-31263-132799264264485/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867580.5784874-31263-132799264264485/AnsiballZ_command.py" <<< 30575 1726867580.67548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867580.67559: stdout chunk (state=3): >>><<< 30575 1726867580.67572: stderr chunk (state=3): >>><<< 30575 1726867580.67785: done transferring module to remote 30575 1726867580.67788: _low_level_execute_command(): starting 30575 1726867580.67790: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867580.5784874-31263-132799264264485/ /root/.ansible/tmp/ansible-tmp-1726867580.5784874-31263-132799264264485/AnsiballZ_command.py && sleep 0' 30575 1726867580.69030: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867580.69165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867580.69251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867580.69292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867580.71126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867580.71383: stderr chunk (state=3): >>><<< 30575 1726867580.71387: stdout chunk (state=3): >>><<< 30575 1726867580.71389: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867580.71399: _low_level_execute_command(): starting 30575 1726867580.71405: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867580.5784874-31263-132799264264485/AnsiballZ_command.py && sleep 0' 30575 1726867580.73093: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867580.73223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867580.73242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867580.73388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867580.88849: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": 
"2024-09-20 17:26:20.882875", "end": "2024-09-20 17:26:20.886188", "delta": "0:00:00.003313", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867580.90360: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867580.90365: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. <<< 30575 1726867580.90370: stdout chunk (state=3): >>><<< 30575 1726867580.90526: stderr chunk (state=3): >>><<< 30575 1726867580.90531: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:26:20.882875", "end": "2024-09-20 17:26:20.886188", "delta": "0:00:00.003313", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867580.90534: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867580.5784874-31263-132799264264485/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867580.90537: _low_level_execute_command(): starting 30575 1726867580.90539: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867580.5784874-31263-132799264264485/ > /dev/null 2>&1 && sleep 0' 30575 1726867580.91073: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867580.91159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867580.91165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867580.91168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867580.91179: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867580.91182: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867580.91184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867580.91186: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867580.91188: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867580.91190: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867580.91192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867580.91194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867580.91320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867580.91324: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867580.91326: stderr chunk (state=3): >>>debug2: match found <<< 30575 1726867580.91328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867580.91538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867580.91542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867580.91591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867580.93457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867580.93461: stdout chunk (state=3): >>><<< 30575 1726867580.93464: stderr chunk (state=3): >>><<< 30575 1726867580.93490: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867580.93497: handler run complete 30575 1726867580.93684: Evaluated conditional (False): False 30575 1726867580.93688: attempt loop complete, returning result 30575 1726867580.93691: _execute() done 30575 1726867580.93693: dumping result to json 30575 1726867580.93696: done dumping result, returning 30575 1726867580.93698: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affcac9-a3a5-e081-a588-000000000618] 30575 1726867580.93700: sending task result for task 0affcac9-a3a5-e081-a588-000000000618 30575 1726867580.93765: done sending task result for task 0affcac9-a3a5-e081-a588-000000000618 30575 1726867580.93769: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003313", "end": "2024-09-20 17:26:20.886188", "rc": 
0, "start": "2024-09-20 17:26:20.882875" } STDOUT: bonding_masters eth0 lo 30575 1726867580.93859: no more pending results, returning what we have 30575 1726867580.93863: results queue empty 30575 1726867580.93864: checking for any_errors_fatal 30575 1726867580.93866: done checking for any_errors_fatal 30575 1726867580.93866: checking for max_fail_percentage 30575 1726867580.93868: done checking for max_fail_percentage 30575 1726867580.93869: checking to see if all hosts have failed and the running result is not ok 30575 1726867580.93870: done checking to see if all hosts have failed 30575 1726867580.93871: getting the remaining hosts for this loop 30575 1726867580.93872: done getting the remaining hosts for this loop 30575 1726867580.93880: getting the next task for host managed_node3 30575 1726867580.93888: done getting next task for host managed_node3 30575 1726867580.93891: ^ task is: TASK: Set current_interfaces 30575 1726867580.93897: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867580.93903: getting variables 30575 1726867580.93905: in VariableManager get_vars() 30575 1726867580.93945: Calling all_inventory to load vars for managed_node3 30575 1726867580.93948: Calling groups_inventory to load vars for managed_node3 30575 1726867580.93952: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867580.93964: Calling all_plugins_play to load vars for managed_node3 30575 1726867580.93967: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867580.93970: Calling groups_plugins_play to load vars for managed_node3 30575 1726867580.95888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867580.97998: done with get_vars() 30575 1726867580.98027: done getting variables 30575 1726867580.98085: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 17:26:20 -0400 (0:00:00.459) 0:00:16.358 ****** 30575 1726867580.98126: entering _queue_task() for managed_node3/set_fact 30575 1726867580.98573: worker is 1 (out of 1 available) 30575 1726867580.98597: exiting _queue_task() for managed_node3/set_fact 30575 1726867580.98608: done queuing things up, now waiting for results queue to drain 30575 1726867580.98610: waiting for pending results... 
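The "Gather current interface info" exchange above ends with `_low_level_execute_command()` receiving a single JSON blob on stdout from the remote `AnsiballZ_command.py` and decoding it into the task result. A minimal sketch of that decode step, using an abridged copy of the payload from the log (plain `json.loads`; this is an illustration of the data shape, not Ansible's internal result-handling code):

```python
import json

# Abridged stdout captured from the remote AnsiballZ_command.py run above;
# the real payload also carries the full "invocation" module_args dict.
raw_stdout = (
    '{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", '
    '"stderr": "", "rc": 0, "cmd": ["ls", "-1"]}'
)

result = json.loads(raw_stdout)

# Splitting stdout on newlines yields the per-interface list that the
# following set_fact task consumes.
stdout_lines = result["stdout"].splitlines()
```

With the payload shown in the log, `result["rc"]` is `0` and `stdout_lines` is `["bonding_masters", "eth0", "lo"]`, matching the STDOUT block printed in the task summary.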
30575 1726867580.98842: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 30575 1726867580.99284: in run() - task 0affcac9-a3a5-e081-a588-000000000619 30575 1726867580.99288: variable 'ansible_search_path' from source: unknown 30575 1726867580.99291: variable 'ansible_search_path' from source: unknown 30575 1726867580.99294: calling self._execute() 30575 1726867580.99297: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867580.99300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867580.99303: variable 'omit' from source: magic vars 30575 1726867580.99630: variable 'ansible_distribution_major_version' from source: facts 30575 1726867580.99758: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867580.99764: variable 'omit' from source: magic vars 30575 1726867581.00083: variable 'omit' from source: magic vars 30575 1726867581.00240: variable '_current_interfaces' from source: set_fact 30575 1726867581.00306: variable 'omit' from source: magic vars 30575 1726867581.00583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867581.00586: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867581.00589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867581.00595: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867581.00699: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867581.00774: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867581.00779: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867581.00783: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867581.01012: Set connection var ansible_pipelining to False 30575 1726867581.01016: Set connection var ansible_shell_type to sh 30575 1726867581.01022: Set connection var ansible_shell_executable to /bin/sh 30575 1726867581.01027: Set connection var ansible_timeout to 10 30575 1726867581.01032: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867581.01164: Set connection var ansible_connection to ssh 30575 1726867581.01206: variable 'ansible_shell_executable' from source: unknown 30575 1726867581.01209: variable 'ansible_connection' from source: unknown 30575 1726867581.01212: variable 'ansible_module_compression' from source: unknown 30575 1726867581.01214: variable 'ansible_shell_type' from source: unknown 30575 1726867581.01216: variable 'ansible_shell_executable' from source: unknown 30575 1726867581.01218: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867581.01221: variable 'ansible_pipelining' from source: unknown 30575 1726867581.01225: variable 'ansible_timeout' from source: unknown 30575 1726867581.01228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867581.01567: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867581.01584: variable 'omit' from source: magic vars 30575 1726867581.01795: starting attempt loop 30575 1726867581.01800: running the handler 30575 1726867581.01803: handler run complete 30575 1726867581.01806: attempt loop complete, returning result 30575 1726867581.01809: _execute() done 30575 1726867581.01811: dumping result to json 30575 1726867581.01814: done dumping result, returning 30575 
1726867581.01831: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affcac9-a3a5-e081-a588-000000000619] 30575 1726867581.01834: sending task result for task 0affcac9-a3a5-e081-a588-000000000619 30575 1726867581.02109: done sending task result for task 0affcac9-a3a5-e081-a588-000000000619 30575 1726867581.02113: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 30575 1726867581.02186: no more pending results, returning what we have 30575 1726867581.02190: results queue empty 30575 1726867581.02191: checking for any_errors_fatal 30575 1726867581.02203: done checking for any_errors_fatal 30575 1726867581.02204: checking for max_fail_percentage 30575 1726867581.02206: done checking for max_fail_percentage 30575 1726867581.02207: checking to see if all hosts have failed and the running result is not ok 30575 1726867581.02208: done checking to see if all hosts have failed 30575 1726867581.02209: getting the remaining hosts for this loop 30575 1726867581.02210: done getting the remaining hosts for this loop 30575 1726867581.02215: getting the next task for host managed_node3 30575 1726867581.02227: done getting next task for host managed_node3 30575 1726867581.02230: ^ task is: TASK: Show current_interfaces 30575 1726867581.02234: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867581.02240: getting variables 30575 1726867581.02242: in VariableManager get_vars() 30575 1726867581.02501: Calling all_inventory to load vars for managed_node3 30575 1726867581.02504: Calling groups_inventory to load vars for managed_node3 30575 1726867581.02508: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867581.02521: Calling all_plugins_play to load vars for managed_node3 30575 1726867581.02526: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867581.02530: Calling groups_plugins_play to load vars for managed_node3 30575 1726867581.06498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867581.10350: done with get_vars() 30575 1726867581.10399: done getting variables 30575 1726867581.10482: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 17:26:21 -0400 (0:00:00.123) 0:00:16.482 ****** 30575 1726867581.10515: entering _queue_task() for managed_node3/debug 30575 1726867581.10999: worker is 1 (out of 1 available) 30575 1726867581.11020: exiting _queue_task() for managed_node3/debug 30575 1726867581.11033: done queuing things up, now waiting for results queue to drain 30575 1726867581.11036: waiting for pending results... 
30575 1726867581.11496: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 30575 1726867581.11502: in run() - task 0affcac9-a3a5-e081-a588-0000000005de 30575 1726867581.11506: variable 'ansible_search_path' from source: unknown 30575 1726867581.11511: variable 'ansible_search_path' from source: unknown 30575 1726867581.11515: calling self._execute() 30575 1726867581.11608: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867581.11612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867581.11884: variable 'omit' from source: magic vars 30575 1726867581.12034: variable 'ansible_distribution_major_version' from source: facts 30575 1726867581.12087: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867581.12090: variable 'omit' from source: magic vars 30575 1726867581.12103: variable 'omit' from source: magic vars 30575 1726867581.12203: variable 'current_interfaces' from source: set_fact 30575 1726867581.12242: variable 'omit' from source: magic vars 30575 1726867581.12282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867581.12318: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867581.12348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867581.12366: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867581.12379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867581.12412: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867581.12415: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867581.12419: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867581.12682: Set connection var ansible_pipelining to False 30575 1726867581.12686: Set connection var ansible_shell_type to sh 30575 1726867581.12688: Set connection var ansible_shell_executable to /bin/sh 30575 1726867581.12691: Set connection var ansible_timeout to 10 30575 1726867581.12693: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867581.12695: Set connection var ansible_connection to ssh 30575 1726867581.12697: variable 'ansible_shell_executable' from source: unknown 30575 1726867581.12700: variable 'ansible_connection' from source: unknown 30575 1726867581.12702: variable 'ansible_module_compression' from source: unknown 30575 1726867581.12704: variable 'ansible_shell_type' from source: unknown 30575 1726867581.12706: variable 'ansible_shell_executable' from source: unknown 30575 1726867581.12708: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867581.12710: variable 'ansible_pipelining' from source: unknown 30575 1726867581.12712: variable 'ansible_timeout' from source: unknown 30575 1726867581.12714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867581.12746: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867581.12757: variable 'omit' from source: magic vars 30575 1726867581.12772: starting attempt loop 30575 1726867581.12775: running the handler 30575 1726867581.12823: handler run complete 30575 1726867581.12840: attempt loop complete, returning result 30575 1726867581.12843: _execute() done 30575 1726867581.12845: dumping result to json 30575 1726867581.12848: done dumping result, returning 30575 1726867581.12857: done 
running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affcac9-a3a5-e081-a588-0000000005de] 30575 1726867581.12860: sending task result for task 0affcac9-a3a5-e081-a588-0000000005de 30575 1726867581.12955: done sending task result for task 0affcac9-a3a5-e081-a588-0000000005de 30575 1726867581.12958: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 30575 1726867581.13035: no more pending results, returning what we have 30575 1726867581.13039: results queue empty 30575 1726867581.13040: checking for any_errors_fatal 30575 1726867581.13046: done checking for any_errors_fatal 30575 1726867581.13047: checking for max_fail_percentage 30575 1726867581.13049: done checking for max_fail_percentage 30575 1726867581.13049: checking to see if all hosts have failed and the running result is not ok 30575 1726867581.13050: done checking to see if all hosts have failed 30575 1726867581.13051: getting the remaining hosts for this loop 30575 1726867581.13052: done getting the remaining hosts for this loop 30575 1726867581.13057: getting the next task for host managed_node3 30575 1726867581.13066: done getting next task for host managed_node3 30575 1726867581.13069: ^ task is: TASK: Setup 30575 1726867581.13072: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867581.13080: getting variables 30575 1726867581.13082: in VariableManager get_vars() 30575 1726867581.13306: Calling all_inventory to load vars for managed_node3 30575 1726867581.13309: Calling groups_inventory to load vars for managed_node3 30575 1726867581.13312: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867581.13322: Calling all_plugins_play to load vars for managed_node3 30575 1726867581.13327: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867581.13330: Calling groups_plugins_play to load vars for managed_node3 30575 1726867581.14722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867581.16775: done with get_vars() 30575 1726867581.16803: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 17:26:21 -0400 (0:00:00.065) 0:00:16.547 ****** 30575 1726867581.17020: entering _queue_task() for managed_node3/include_tasks 30575 1726867581.17449: worker is 1 (out of 1 available) 30575 1726867581.17463: exiting _queue_task() for managed_node3/include_tasks 30575 1726867581.17476: done queuing things up, now waiting for results queue to drain 30575 1726867581.17480: waiting for pending results... 
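The `Setup` task queued here is an `include_tasks` loop: the log shows `lsr_setup` resolved from include params, a per-`item` conditional on `ansible_distribution_major_version`, and two files subsequently included (tasks/delete_interface.yml and tasks/assert_device_absent.yml). A minimal sketch of that pattern, reconstructed from the log rather than copied from run_test.yml:24:

```yaml
# Hypothetical sketch of the "Setup" include loop at
# tests/network/playbooks/tasks/run_test.yml:24, inferred from the
# variables and included files visible in the log.
- name: Setup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_setup }}"  # here: [tasks/delete_interface.yml, tasks/assert_device_absent.yml]
  when: ansible_distribution_major_version != '6'
```

Each loop item is evaluated against the conditional individually, which is why the log shows `Evaluated conditional (ansible_distribution_major_version != '6'): True` once per `item`.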
30575 1726867581.17789: running TaskExecutor() for managed_node3/TASK: Setup 30575 1726867581.17882: in run() - task 0affcac9-a3a5-e081-a588-0000000005b7 30575 1726867581.18017: variable 'ansible_search_path' from source: unknown 30575 1726867581.18021: variable 'ansible_search_path' from source: unknown 30575 1726867581.18024: variable 'lsr_setup' from source: include params 30575 1726867581.18159: variable 'lsr_setup' from source: include params 30575 1726867581.18227: variable 'omit' from source: magic vars 30575 1726867581.18362: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867581.18370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867581.18381: variable 'omit' from source: magic vars 30575 1726867581.18624: variable 'ansible_distribution_major_version' from source: facts 30575 1726867581.18636: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867581.18642: variable 'item' from source: unknown 30575 1726867581.18710: variable 'item' from source: unknown 30575 1726867581.18797: variable 'item' from source: unknown 30575 1726867581.18813: variable 'item' from source: unknown 30575 1726867581.18969: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867581.18973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867581.18976: variable 'omit' from source: magic vars 30575 1726867581.19385: variable 'ansible_distribution_major_version' from source: facts 30575 1726867581.19389: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867581.19391: variable 'item' from source: unknown 30575 1726867581.19393: variable 'item' from source: unknown 30575 1726867581.19395: variable 'item' from source: unknown 30575 1726867581.19397: variable 'item' from source: unknown 30575 1726867581.19449: dumping result to json 30575 1726867581.19452: done dumping result, returning 30575 
1726867581.19454: done running TaskExecutor() for managed_node3/TASK: Setup [0affcac9-a3a5-e081-a588-0000000005b7] 30575 1726867581.19457: sending task result for task 0affcac9-a3a5-e081-a588-0000000005b7 30575 1726867581.19708: done sending task result for task 0affcac9-a3a5-e081-a588-0000000005b7 30575 1726867581.19711: WORKER PROCESS EXITING 30575 1726867581.19735: no more pending results, returning what we have 30575 1726867581.19739: in VariableManager get_vars() 30575 1726867581.19774: Calling all_inventory to load vars for managed_node3 30575 1726867581.19779: Calling groups_inventory to load vars for managed_node3 30575 1726867581.19783: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867581.19796: Calling all_plugins_play to load vars for managed_node3 30575 1726867581.19799: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867581.19802: Calling groups_plugins_play to load vars for managed_node3 30575 1726867581.21458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867581.23521: done with get_vars() 30575 1726867581.23544: variable 'ansible_search_path' from source: unknown 30575 1726867581.23545: variable 'ansible_search_path' from source: unknown 30575 1726867581.23609: variable 'ansible_search_path' from source: unknown 30575 1726867581.23610: variable 'ansible_search_path' from source: unknown 30575 1726867581.23646: we have included files to process 30575 1726867581.23647: generating all_blocks data 30575 1726867581.23649: done generating all_blocks data 30575 1726867581.23653: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30575 1726867581.23654: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30575 1726867581.23656: Loading data from 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30575 1726867581.23855: done processing included file 30575 1726867581.23857: iterating over new_blocks loaded from include file 30575 1726867581.23859: in VariableManager get_vars() 30575 1726867581.23873: done with get_vars() 30575 1726867581.23875: filtering new block on tags 30575 1726867581.23899: done filtering new block on tags 30575 1726867581.23902: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node3 => (item=tasks/delete_interface.yml) 30575 1726867581.23906: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30575 1726867581.23907: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30575 1726867581.23910: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30575 1726867581.24008: in VariableManager get_vars() 30575 1726867581.24037: done with get_vars() 30575 1726867581.24122: done processing included file 30575 1726867581.24126: iterating over new_blocks loaded from include file 30575 1726867581.24128: in VariableManager get_vars() 30575 1726867581.24147: done with get_vars() 30575 1726867581.24148: filtering new block on tags 30575 1726867581.24179: done filtering new block on tags 30575 1726867581.24181: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node3 => (item=tasks/assert_device_absent.yml) 30575 1726867581.24185: extending task lists for all hosts with 
included blocks 30575 1726867581.24687: done extending task lists 30575 1726867581.24688: done processing included files 30575 1726867581.24689: results queue empty 30575 1726867581.24689: checking for any_errors_fatal 30575 1726867581.24691: done checking for any_errors_fatal 30575 1726867581.24692: checking for max_fail_percentage 30575 1726867581.24692: done checking for max_fail_percentage 30575 1726867581.24693: checking to see if all hosts have failed and the running result is not ok 30575 1726867581.24693: done checking to see if all hosts have failed 30575 1726867581.24694: getting the remaining hosts for this loop 30575 1726867581.24695: done getting the remaining hosts for this loop 30575 1726867581.24696: getting the next task for host managed_node3 30575 1726867581.24699: done getting next task for host managed_node3 30575 1726867581.24700: ^ task is: TASK: Remove test interface if necessary 30575 1726867581.24702: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867581.24704: getting variables 30575 1726867581.24705: in VariableManager get_vars() 30575 1726867581.24714: Calling all_inventory to load vars for managed_node3 30575 1726867581.24716: Calling groups_inventory to load vars for managed_node3 30575 1726867581.24717: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867581.24721: Calling all_plugins_play to load vars for managed_node3 30575 1726867581.24722: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867581.24725: Calling groups_plugins_play to load vars for managed_node3 30575 1726867581.25387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867581.26464: done with get_vars() 30575 1726867581.26484: done getting variables 30575 1726867581.26520: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 17:26:21 -0400 (0:00:00.095) 0:00:16.643 ****** 30575 1726867581.26545: entering _queue_task() for managed_node3/command 30575 1726867581.26880: worker is 1 (out of 1 available) 30575 1726867581.26893: exiting _queue_task() for managed_node3/command 30575 1726867581.26904: done queuing things up, now waiting for results queue to drain 30575 1726867581.26906: waiting for pending results... 
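The task queued next, `Remove test interface if necessary` (delete_interface.yml:3), runs the `command` action and references an `interface` play variable. The actual command line is not visible in this log excerpt; the following is a purely illustrative shape for such a cleanup task:

```yaml
# Illustrative only -- the real command in delete_interface.yml is not
# shown in this excerpt; "ip link del" is an assumption based on the
# task's stated purpose.
- name: Remove test interface if necessary
  command: ip link del {{ interface }}
  failed_when: false  # tolerate the interface already being absent
```

What the log does show is the generic `command`-module machinery that follows: `_low_level_execute_command()` probing the remote home directory (`echo ~`), creating a remote temp directory under `~/.ansible/tmp` with `umask 77`, and transferring the AnsiballZ-packed module over the multiplexed SSH connection via sftp.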
30575 1726867581.27154: running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary 30575 1726867581.27397: in run() - task 0affcac9-a3a5-e081-a588-00000000063e 30575 1726867581.27401: variable 'ansible_search_path' from source: unknown 30575 1726867581.27404: variable 'ansible_search_path' from source: unknown 30575 1726867581.27407: calling self._execute() 30575 1726867581.27410: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867581.27412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867581.27414: variable 'omit' from source: magic vars 30575 1726867581.27810: variable 'ansible_distribution_major_version' from source: facts 30575 1726867581.27814: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867581.27817: variable 'omit' from source: magic vars 30575 1726867581.27826: variable 'omit' from source: magic vars 30575 1726867581.27983: variable 'interface' from source: play vars 30575 1726867581.27987: variable 'omit' from source: magic vars 30575 1726867581.27989: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867581.27993: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867581.28013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867581.28031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867581.28045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867581.28075: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867581.28080: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867581.28083: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867581.28170: Set connection var ansible_pipelining to False 30575 1726867581.28173: Set connection var ansible_shell_type to sh 30575 1726867581.28179: Set connection var ansible_shell_executable to /bin/sh 30575 1726867581.28185: Set connection var ansible_timeout to 10 30575 1726867581.28191: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867581.28198: Set connection var ansible_connection to ssh 30575 1726867581.28226: variable 'ansible_shell_executable' from source: unknown 30575 1726867581.28231: variable 'ansible_connection' from source: unknown 30575 1726867581.28233: variable 'ansible_module_compression' from source: unknown 30575 1726867581.28235: variable 'ansible_shell_type' from source: unknown 30575 1726867581.28237: variable 'ansible_shell_executable' from source: unknown 30575 1726867581.28354: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867581.28358: variable 'ansible_pipelining' from source: unknown 30575 1726867581.28361: variable 'ansible_timeout' from source: unknown 30575 1726867581.28364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867581.28367: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867581.28370: variable 'omit' from source: magic vars 30575 1726867581.28485: starting attempt loop 30575 1726867581.28489: running the handler 30575 1726867581.28492: _low_level_execute_command(): starting 30575 1726867581.28494: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867581.29062: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867581.29092: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867581.29136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867581.29156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867581.29206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867581.30914: stdout chunk (state=3): >>>/root <<< 30575 1726867581.31040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867581.31043: stdout chunk (state=3): >>><<< 30575 1726867581.31045: stderr chunk (state=3): >>><<< 30575 1726867581.31149: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867581.31152: _low_level_execute_command(): starting 30575 1726867581.31163: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867581.3106835-31336-96003909789694 `" && echo ansible-tmp-1726867581.3106835-31336-96003909789694="` echo /root/.ansible/tmp/ansible-tmp-1726867581.3106835-31336-96003909789694 `" ) && sleep 0' 30575 1726867581.31660: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867581.31663: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867581.31665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867581.31669: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867581.31680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867581.31729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867581.31733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867581.31782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867581.33662: stdout chunk (state=3): >>>ansible-tmp-1726867581.3106835-31336-96003909789694=/root/.ansible/tmp/ansible-tmp-1726867581.3106835-31336-96003909789694 <<< 30575 1726867581.33762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867581.33789: stderr chunk (state=3): >>><<< 30575 1726867581.33793: stdout chunk (state=3): >>><<< 30575 1726867581.33806: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867581.3106835-31336-96003909789694=/root/.ansible/tmp/ansible-tmp-1726867581.3106835-31336-96003909789694 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867581.33830: variable 'ansible_module_compression' from source: unknown 30575 1726867581.33868: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867581.33903: variable 'ansible_facts' from source: unknown 30575 1726867581.33951: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867581.3106835-31336-96003909789694/AnsiballZ_command.py 30575 1726867581.34052: Sending initial data 30575 1726867581.34056: Sent initial data (155 bytes) 30575 1726867581.34491: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867581.34497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867581.34510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867581.34513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867581.34574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867581.36107: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867581.36153: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867581.36196: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpdrskjhjr /root/.ansible/tmp/ansible-tmp-1726867581.3106835-31336-96003909789694/AnsiballZ_command.py <<< 30575 1726867581.36199: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867581.3106835-31336-96003909789694/AnsiballZ_command.py" <<< 30575 1726867581.36238: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpdrskjhjr" to remote "/root/.ansible/tmp/ansible-tmp-1726867581.3106835-31336-96003909789694/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867581.3106835-31336-96003909789694/AnsiballZ_command.py" <<< 30575 1726867581.36833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867581.36867: stderr chunk (state=3): >>><<< 30575 1726867581.36871: stdout chunk (state=3): >>><<< 30575 1726867581.36919: done transferring module to remote 30575 1726867581.36929: _low_level_execute_command(): starting 30575 1726867581.36934: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867581.3106835-31336-96003909789694/ /root/.ansible/tmp/ansible-tmp-1726867581.3106835-31336-96003909789694/AnsiballZ_command.py && sleep 0' 30575 1726867581.37364: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867581.37367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867581.37369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 
1726867581.37371: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867581.37373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867581.37409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867581.37421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867581.37489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867581.39189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867581.39216: stderr chunk (state=3): >>><<< 30575 1726867581.39219: stdout chunk (state=3): >>><<< 30575 1726867581.39232: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867581.39235: _low_level_execute_command(): starting 30575 1726867581.39240: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867581.3106835-31336-96003909789694/AnsiballZ_command.py && sleep 0' 30575 1726867581.39884: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867581.39930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867581.39953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867581.40029: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867581.56387: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 17:26:21.551338", "end": "2024-09-20 17:26:21.557585", "delta": "0:00:00.006247", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867581.57516: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. <<< 30575 1726867581.57520: stdout chunk (state=3): >>><<< 30575 1726867581.57525: stderr chunk (state=3): >>><<< 30575 1726867581.57619: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 17:26:21.551338", "end": "2024-09-20 17:26:21.557585", "delta": "0:00:00.006247", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. 30575 1726867581.57626: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867581.3106835-31336-96003909789694/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867581.57628: _low_level_execute_command(): starting 30575 1726867581.57630: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867581.3106835-31336-96003909789694/ > /dev/null 2>&1 && sleep 0' 30575 1726867581.58919: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 
1726867581.58925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867581.58940: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867581.58946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867581.58960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867581.58965: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867581.59152: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867581.59160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867581.59163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867581.59165: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867581.59190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867581.59305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867581.61139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867581.61596: stderr chunk (state=3): >>><<< 30575 1726867581.61599: stdout chunk (state=3): >>><<< 30575 1726867581.61618: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867581.61627: handler run complete 30575 1726867581.61649: Evaluated conditional (False): False 30575 1726867581.61682: attempt loop complete, returning result 30575 1726867581.61685: _execute() done 30575 1726867581.61688: dumping result to json 30575 1726867581.61690: done dumping result, returning 30575 1726867581.61693: done running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary [0affcac9-a3a5-e081-a588-00000000063e] 30575 1726867581.61695: sending task result for task 0affcac9-a3a5-e081-a588-00000000063e 30575 1726867581.61797: done sending task result for task 0affcac9-a3a5-e081-a588-00000000063e 30575 1726867581.61801: WORKER PROCESS EXITING fatal: [managed_node3]: FAILED! 
=> { "changed": false, "cmd": [ "ip", "link", "del", "statebr" ], "delta": "0:00:00.006247", "end": "2024-09-20 17:26:21.557585", "rc": 1, "start": "2024-09-20 17:26:21.551338" } STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 30575 1726867581.61875: no more pending results, returning what we have 30575 1726867581.61883: results queue empty 30575 1726867581.61884: checking for any_errors_fatal 30575 1726867581.61885: done checking for any_errors_fatal 30575 1726867581.61885: checking for max_fail_percentage 30575 1726867581.61887: done checking for max_fail_percentage 30575 1726867581.61888: checking to see if all hosts have failed and the running result is not ok 30575 1726867581.61889: done checking to see if all hosts have failed 30575 1726867581.61890: getting the remaining hosts for this loop 30575 1726867581.61891: done getting the remaining hosts for this loop 30575 1726867581.61896: getting the next task for host managed_node3 30575 1726867581.61905: done getting next task for host managed_node3 30575 1726867581.61908: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30575 1726867581.61912: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 30575 1726867581.61917: getting variables 30575 1726867581.61919: in VariableManager get_vars() 30575 1726867581.61954: Calling all_inventory to load vars for managed_node3 30575 1726867581.61957: Calling groups_inventory to load vars for managed_node3 30575 1726867581.61960: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867581.61972: Calling all_plugins_play to load vars for managed_node3 30575 1726867581.61975: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867581.62183: Calling groups_plugins_play to load vars for managed_node3 30575 1726867581.65502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867581.67349: done with get_vars() 30575 1726867581.67374: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 17:26:21 -0400 (0:00:00.409) 0:00:17.052 ****** 30575 1726867581.67469: entering _queue_task() for managed_node3/include_tasks 30575 1726867581.67880: worker is 1 (out of 1 available) 30575 1726867581.67892: exiting _queue_task() for managed_node3/include_tasks 30575 1726867581.67903: done queuing things up, now waiting for results queue to drain 30575 1726867581.67905: waiting for pending results... 
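The failed task above returns a plain JSON document from the command module (`rc: 1`, `failed: true`, `msg: "non-zero return code"`), yet the play continues with `...ignoring`. A minimal sketch of how such a result can be interpreted — the helper name and logic here are illustrative only, not Ansible's actual `TaskExecutor` internals:

```python
import json

# Module output as it appears in the log above, abridged to the fields
# needed here; in a real run this arrives on stdout of the remote
# AnsiballZ_command.py process.
raw = ('{"changed": true, "stdout": "", "stderr": "Cannot find device \\"statebr\\",",'
       ' "rc": 1, "cmd": ["ip", "link", "del", "statebr"],'
       ' "failed": true, "msg": "non-zero return code"}')

def summarize(result_json: str, ignore_errors: bool = True) -> dict:
    """Illustrative-only helper: decide how a task result is reported.

    `ignore_errors` mirrors the playbook keyword that produced the
    '...ignoring' line in the log; this is a sketch, not the real
    result-handling code.
    """
    result = json.loads(result_json)
    failed = bool(result.get("failed")) or result.get("rc", 0) != 0
    return {
        "failed": failed,
        # With ignore_errors, the play proceeds even though the task failed.
        "halts_play": failed and not ignore_errors,
        "msg": result.get("msg", ""),
    }

summary = summarize(raw)
print(summary["failed"], summary["halts_play"])  # True False
```

This matches what the log shows next: the result is recorded as `FAILED!`, but the host is not removed from the run and the next task is queued.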
30575 1726867581.68497: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 30575 1726867581.68506: in run() - task 0affcac9-a3a5-e081-a588-000000000642 30575 1726867581.68510: variable 'ansible_search_path' from source: unknown 30575 1726867581.68513: variable 'ansible_search_path' from source: unknown 30575 1726867581.68517: calling self._execute() 30575 1726867581.68520: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867581.68525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867581.68529: variable 'omit' from source: magic vars 30575 1726867581.68982: variable 'ansible_distribution_major_version' from source: facts 30575 1726867581.68985: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867581.68988: _execute() done 30575 1726867581.68990: dumping result to json 30575 1726867581.68992: done dumping result, returning 30575 1726867581.68994: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-e081-a588-000000000642] 30575 1726867581.68995: sending task result for task 0affcac9-a3a5-e081-a588-000000000642 30575 1726867581.69059: done sending task result for task 0affcac9-a3a5-e081-a588-000000000642 30575 1726867581.69062: WORKER PROCESS EXITING 30575 1726867581.69101: no more pending results, returning what we have 30575 1726867581.69106: in VariableManager get_vars() 30575 1726867581.69138: Calling all_inventory to load vars for managed_node3 30575 1726867581.69143: Calling groups_inventory to load vars for managed_node3 30575 1726867581.69146: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867581.69157: Calling all_plugins_play to load vars for managed_node3 30575 1726867581.69159: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867581.69162: Calling groups_plugins_play to load vars for managed_node3 30575 
1726867581.75485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867581.76974: done with get_vars() 30575 1726867581.77004: variable 'ansible_search_path' from source: unknown 30575 1726867581.77005: variable 'ansible_search_path' from source: unknown 30575 1726867581.77015: variable 'item' from source: include params 30575 1726867581.77101: variable 'item' from source: include params 30575 1726867581.77138: we have included files to process 30575 1726867581.77139: generating all_blocks data 30575 1726867581.77141: done generating all_blocks data 30575 1726867581.77143: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867581.77145: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867581.77147: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867581.77316: done processing included file 30575 1726867581.77318: iterating over new_blocks loaded from include file 30575 1726867581.77319: in VariableManager get_vars() 30575 1726867581.77338: done with get_vars() 30575 1726867581.77340: filtering new block on tags 30575 1726867581.77364: done filtering new block on tags 30575 1726867581.77366: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 30575 1726867581.77371: extending task lists for all hosts with included blocks 30575 1726867581.77537: done extending task lists 30575 1726867581.77544: done processing included files 30575 1726867581.77545: results queue empty 30575 1726867581.77546: checking for any_errors_fatal 30575 1726867581.77549: done 
checking for any_errors_fatal 30575 1726867581.77550: checking for max_fail_percentage 30575 1726867581.77551: done checking for max_fail_percentage 30575 1726867581.77552: checking to see if all hosts have failed and the running result is not ok 30575 1726867581.77553: done checking to see if all hosts have failed 30575 1726867581.77553: getting the remaining hosts for this loop 30575 1726867581.77555: done getting the remaining hosts for this loop 30575 1726867581.77557: getting the next task for host managed_node3 30575 1726867581.77561: done getting next task for host managed_node3 30575 1726867581.77563: ^ task is: TASK: Get stat for interface {{ interface }} 30575 1726867581.77566: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867581.77568: getting variables 30575 1726867581.77569: in VariableManager get_vars() 30575 1726867581.77580: Calling all_inventory to load vars for managed_node3 30575 1726867581.77582: Calling groups_inventory to load vars for managed_node3 30575 1726867581.77584: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867581.77590: Calling all_plugins_play to load vars for managed_node3 30575 1726867581.77592: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867581.77595: Calling groups_plugins_play to load vars for managed_node3 30575 1726867581.78685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867581.80400: done with get_vars() 30575 1726867581.80425: done getting variables 30575 1726867581.80560: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:26:21 -0400 (0:00:00.131) 0:00:17.183 ****** 30575 1726867581.80623: entering _queue_task() for managed_node3/stat 30575 1726867581.80994: worker is 1 (out of 1 available) 30575 1726867581.81006: exiting _queue_task() for managed_node3/stat 30575 1726867581.81385: done queuing things up, now waiting for results queue to drain 30575 1726867581.81389: waiting for pending results... 
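Every remote step that follows is issued through `_low_level_execute_command()`, which wraps the work in a `/bin/sh -c '... && sleep 0'` one-liner — visible in the log for the temp-directory creation (`( umask 77 && mkdir -p ... )`), the `chmod u+x`, the module invocation, and the `rm -f -r` cleanup. A minimal sketch of how such command strings can be composed; the function and variable names are illustrative assumptions, not the actual ssh connection-plugin code:

```python
import time

def wrap_for_sh(cmd: str) -> str:
    """Wrap a remote command the way the log shows: run it under
    /bin/sh -c and append `&& sleep 0`. Illustrative sketch only."""
    return f"/bin/sh -c '{cmd} && sleep 0'"

def make_tmpdir_cmd(remote_tmp: str = "~/.ansible/tmp") -> str:
    # Mirrors the pattern visible in the log: a unique directory name
    # built from the current time, echoed back as name=path so the
    # controller learns the shell-expanded remote path.
    basename = f"ansible-tmp-{time.time()}"
    return wrap_for_sh(
        f'( umask 77 && mkdir -p "` echo {remote_tmp} `" '
        f'&& mkdir "` echo {remote_tmp}/{basename} `" '
        f'&& echo {basename}="` echo {remote_tmp}/{basename} `" )'
    )

cmd = make_tmpdir_cmd()
print(cmd.startswith("/bin/sh -c"))  # True
```

The `umask 77` keeps the temp directory private to the remote user, and echoing the path back is what lets the controller substitute `~` without needing a separate round trip.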
30575 1726867581.81608: running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr 30575 1726867581.81848: in run() - task 0affcac9-a3a5-e081-a588-000000000691 30575 1726867581.81860: variable 'ansible_search_path' from source: unknown 30575 1726867581.81864: variable 'ansible_search_path' from source: unknown 30575 1726867581.81903: calling self._execute() 30575 1726867581.82106: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867581.82113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867581.82125: variable 'omit' from source: magic vars 30575 1726867581.82929: variable 'ansible_distribution_major_version' from source: facts 30575 1726867581.82939: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867581.82945: variable 'omit' from source: magic vars 30575 1726867581.83012: variable 'omit' from source: magic vars 30575 1726867581.83113: variable 'interface' from source: play vars 30575 1726867581.83131: variable 'omit' from source: magic vars 30575 1726867581.83197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867581.83382: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867581.83387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867581.83390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867581.83393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867581.83394: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867581.83397: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867581.83399: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867581.83429: Set connection var ansible_pipelining to False 30575 1726867581.83432: Set connection var ansible_shell_type to sh 30575 1726867581.83436: Set connection var ansible_shell_executable to /bin/sh 30575 1726867581.83443: Set connection var ansible_timeout to 10 30575 1726867581.83448: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867581.83455: Set connection var ansible_connection to ssh 30575 1726867581.83484: variable 'ansible_shell_executable' from source: unknown 30575 1726867581.83488: variable 'ansible_connection' from source: unknown 30575 1726867581.83494: variable 'ansible_module_compression' from source: unknown 30575 1726867581.83497: variable 'ansible_shell_type' from source: unknown 30575 1726867581.83500: variable 'ansible_shell_executable' from source: unknown 30575 1726867581.83502: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867581.83506: variable 'ansible_pipelining' from source: unknown 30575 1726867581.83508: variable 'ansible_timeout' from source: unknown 30575 1726867581.83513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867581.83718: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867581.83728: variable 'omit' from source: magic vars 30575 1726867581.83734: starting attempt loop 30575 1726867581.83736: running the handler 30575 1726867581.83749: _low_level_execute_command(): starting 30575 1726867581.83757: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867581.84732: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867581.84737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867581.84739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867581.84928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867581.85250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867581.86768: stdout chunk (state=3): >>>/root <<< 30575 1726867581.86893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867581.86914: stderr chunk (state=3): >>><<< 30575 1726867581.86925: stdout chunk (state=3): >>><<< 30575 1726867581.86951: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867581.86964: _low_level_execute_command(): starting 30575 1726867581.86970: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867581.8695118-31361-31428379759205 `" && echo ansible-tmp-1726867581.8695118-31361-31428379759205="` echo /root/.ansible/tmp/ansible-tmp-1726867581.8695118-31361-31428379759205 `" ) && sleep 0' 30575 1726867581.88510: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867581.88514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867581.88517: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867581.88519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867581.88533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867581.88610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867581.88733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867581.88738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867581.88829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867581.90708: stdout chunk (state=3): >>>ansible-tmp-1726867581.8695118-31361-31428379759205=/root/.ansible/tmp/ansible-tmp-1726867581.8695118-31361-31428379759205 <<< 30575 1726867581.90879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867581.90883: stdout chunk (state=3): >>><<< 30575 1726867581.90886: stderr chunk (state=3): >>><<< 30575 1726867581.90906: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867581.8695118-31361-31428379759205=/root/.ansible/tmp/ansible-tmp-1726867581.8695118-31361-31428379759205 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867581.91088: variable 'ansible_module_compression' from source: unknown 30575 1726867581.91091: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30575 1726867581.91094: variable 'ansible_facts' from source: unknown 30575 1726867581.91162: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867581.8695118-31361-31428379759205/AnsiballZ_stat.py 30575 1726867581.91510: Sending initial data 30575 1726867581.91586: Sent initial data (152 bytes) 30575 1726867581.92135: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867581.92148: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867581.92161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867581.92196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867581.92294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867581.92312: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867581.92337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867581.92418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867581.93940: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30575 1726867581.93957: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30575 1726867581.93975: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30575 1726867581.94010: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867581.94064: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867581.94115: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpx_bmi__3 /root/.ansible/tmp/ansible-tmp-1726867581.8695118-31361-31428379759205/AnsiballZ_stat.py <<< 30575 1726867581.94118: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867581.8695118-31361-31428379759205/AnsiballZ_stat.py" <<< 30575 1726867581.94161: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpx_bmi__3" to remote "/root/.ansible/tmp/ansible-tmp-1726867581.8695118-31361-31428379759205/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867581.8695118-31361-31428379759205/AnsiballZ_stat.py" <<< 30575 1726867581.95419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867581.95422: stdout chunk (state=3): >>><<< 30575 1726867581.95428: stderr chunk (state=3): >>><<< 30575 1726867581.95430: done transferring module to remote 30575 1726867581.95440: _low_level_execute_command(): starting 30575 1726867581.95443: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867581.8695118-31361-31428379759205/ /root/.ansible/tmp/ansible-tmp-1726867581.8695118-31361-31428379759205/AnsiballZ_stat.py && sleep 0' 30575 1726867581.96435: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867581.96493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867581.96616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867581.96661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867581.98437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867581.98449: stderr chunk (state=3): >>><<< 30575 1726867581.98458: stdout chunk (state=3): >>><<< 30575 1726867581.98483: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867581.98574: _low_level_execute_command(): starting 30575 1726867581.98580: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867581.8695118-31361-31428379759205/AnsiballZ_stat.py && sleep 0' 30575 1726867581.99827: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867581.99856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867581.99992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867582.00038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867582.15098: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, 
"get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30575 1726867582.16280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867582.16296: stdout chunk (state=3): >>><<< 30575 1726867582.16321: stderr chunk (state=3): >>><<< 30575 1726867582.16344: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867582.16380: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867581.8695118-31361-31428379759205/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867582.16397: _low_level_execute_command(): starting 30575 1726867582.16407: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867581.8695118-31361-31428379759205/ > /dev/null 2>&1 && sleep 0' 30575 1726867582.17026: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867582.17046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867582.17089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867582.17103: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867582.17116: stderr chunk (state=3): >>>debug2: match found <<< 30575 1726867582.17156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867582.17210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867582.17228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867582.17279: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867582.17328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867582.19173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867582.19176: stdout chunk (state=3): >>><<< 30575 1726867582.19386: stderr chunk (state=3): >>><<< 30575 1726867582.19389: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867582.19391: handler run complete 30575 1726867582.19393: attempt loop complete, returning result 30575 1726867582.19395: _execute() done 30575 1726867582.19397: dumping result to json 30575 1726867582.19398: done dumping result, returning 30575 1726867582.19400: done running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr [0affcac9-a3a5-e081-a588-000000000691] 30575 1726867582.19401: sending task result for task 0affcac9-a3a5-e081-a588-000000000691 30575 1726867582.19463: done sending task result for task 0affcac9-a3a5-e081-a588-000000000691 30575 1726867582.19466: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30575 1726867582.19552: no more pending results, returning what we have 30575 1726867582.19556: results queue empty 30575 1726867582.19557: checking for any_errors_fatal 30575 1726867582.19558: done checking for any_errors_fatal 30575 1726867582.19559: checking for max_fail_percentage 30575 1726867582.19560: done checking for max_fail_percentage 30575 1726867582.19561: checking to see if all hosts have failed and the running result is not ok 30575 1726867582.19562: done checking to see if all hosts have failed 30575 1726867582.19563: getting the remaining hosts for this loop 30575 1726867582.19565: done getting the remaining hosts for this loop 30575 1726867582.19569: getting the next task for host managed_node3 30575 1726867582.19580: done getting next task for host managed_node3 30575 1726867582.19585: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30575 1726867582.19589: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867582.19595: getting variables 30575 1726867582.19596: in VariableManager get_vars() 30575 1726867582.19714: Calling all_inventory to load vars for managed_node3 30575 1726867582.19717: Calling groups_inventory to load vars for managed_node3 30575 1726867582.19721: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867582.19734: Calling all_plugins_play to load vars for managed_node3 30575 1726867582.19737: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867582.19740: Calling groups_plugins_play to load vars for managed_node3 30575 1726867582.21333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867582.22995: done with get_vars() 30575 1726867582.23016: done getting variables 30575 1726867582.23083: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867582.23208: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 17:26:22 -0400 (0:00:00.426) 0:00:17.609 ****** 30575 1726867582.23242: entering _queue_task() for managed_node3/assert 30575 1726867582.23584: worker is 1 (out of 1 available) 30575 1726867582.23597: exiting _queue_task() for managed_node3/assert 30575 1726867582.23608: done queuing things up, now waiting for results queue to drain 30575 1726867582.23609: waiting for pending results... 30575 1726867582.23906: running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' 30575 1726867582.24034: in run() - task 0affcac9-a3a5-e081-a588-000000000643 30575 1726867582.24048: variable 'ansible_search_path' from source: unknown 30575 1726867582.24051: variable 'ansible_search_path' from source: unknown 30575 1726867582.24091: calling self._execute() 30575 1726867582.24182: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867582.24187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867582.24197: variable 'omit' from source: magic vars 30575 1726867582.24586: variable 'ansible_distribution_major_version' from source: facts 30575 1726867582.24600: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867582.24603: variable 'omit' from source: magic vars 30575 1726867582.24648: variable 'omit' from source: magic vars 30575 1726867582.24753: variable 'interface' from source: play vars 30575 1726867582.24776: variable 'omit' from source: magic vars 30575 1726867582.24821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867582.24982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867582.24986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
30575 1726867582.24989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867582.24992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867582.24995: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867582.25001: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867582.25004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867582.25059: Set connection var ansible_pipelining to False 30575 1726867582.25062: Set connection var ansible_shell_type to sh 30575 1726867582.25069: Set connection var ansible_shell_executable to /bin/sh 30575 1726867582.25074: Set connection var ansible_timeout to 10 30575 1726867582.25081: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867582.25089: Set connection var ansible_connection to ssh 30575 1726867582.25121: variable 'ansible_shell_executable' from source: unknown 30575 1726867582.25124: variable 'ansible_connection' from source: unknown 30575 1726867582.25129: variable 'ansible_module_compression' from source: unknown 30575 1726867582.25132: variable 'ansible_shell_type' from source: unknown 30575 1726867582.25134: variable 'ansible_shell_executable' from source: unknown 30575 1726867582.25136: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867582.25141: variable 'ansible_pipelining' from source: unknown 30575 1726867582.25143: variable 'ansible_timeout' from source: unknown 30575 1726867582.25147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867582.25290: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867582.25300: variable 'omit' from source: magic vars 30575 1726867582.25306: starting attempt loop 30575 1726867582.25314: running the handler 30575 1726867582.25464: variable 'interface_stat' from source: set_fact 30575 1726867582.25473: Evaluated conditional (not interface_stat.stat.exists): True 30575 1726867582.25480: handler run complete 30575 1726867582.25682: attempt loop complete, returning result 30575 1726867582.25685: _execute() done 30575 1726867582.25687: dumping result to json 30575 1726867582.25689: done dumping result, returning 30575 1726867582.25690: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' [0affcac9-a3a5-e081-a588-000000000643] 30575 1726867582.25692: sending task result for task 0affcac9-a3a5-e081-a588-000000000643 30575 1726867582.25751: done sending task result for task 0affcac9-a3a5-e081-a588-000000000643 30575 1726867582.25755: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867582.25796: no more pending results, returning what we have 30575 1726867582.25799: results queue empty 30575 1726867582.25800: checking for any_errors_fatal 30575 1726867582.25806: done checking for any_errors_fatal 30575 1726867582.25806: checking for max_fail_percentage 30575 1726867582.25808: done checking for max_fail_percentage 30575 1726867582.25809: checking to see if all hosts have failed and the running result is not ok 30575 1726867582.25810: done checking to see if all hosts have failed 30575 1726867582.25810: getting the remaining hosts for this loop 30575 1726867582.25812: done getting the remaining hosts for this loop 30575 1726867582.25815: getting the next task for host managed_node3 30575 1726867582.25822: done getting next task for host managed_node3 
30575 1726867582.25828: ^ task is: TASK: Test 30575 1726867582.25831: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867582.25835: getting variables 30575 1726867582.25836: in VariableManager get_vars() 30575 1726867582.25969: Calling all_inventory to load vars for managed_node3 30575 1726867582.25971: Calling groups_inventory to load vars for managed_node3 30575 1726867582.25975: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867582.25985: Calling all_plugins_play to load vars for managed_node3 30575 1726867582.25988: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867582.25991: Calling groups_plugins_play to load vars for managed_node3 30575 1726867582.27281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867582.28943: done with get_vars() 30575 1726867582.28965: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 17:26:22 -0400 (0:00:00.058) 0:00:17.668 ****** 30575 1726867582.29074: entering _queue_task() for managed_node3/include_tasks 30575 1726867582.29371: worker is 1 (out of 1 available) 30575 1726867582.29385: exiting _queue_task() for managed_node3/include_tasks 30575 1726867582.29397: done queuing things up, 
now waiting for results queue to drain 30575 1726867582.29398: waiting for pending results... 30575 1726867582.29713: running TaskExecutor() for managed_node3/TASK: Test 30575 1726867582.29794: in run() - task 0affcac9-a3a5-e081-a588-0000000005b8 30575 1726867582.29813: variable 'ansible_search_path' from source: unknown 30575 1726867582.29816: variable 'ansible_search_path' from source: unknown 30575 1726867582.29862: variable 'lsr_test' from source: include params 30575 1726867582.30079: variable 'lsr_test' from source: include params 30575 1726867582.30151: variable 'omit' from source: magic vars 30575 1726867582.30279: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867582.30287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867582.30297: variable 'omit' from source: magic vars 30575 1726867582.30548: variable 'ansible_distribution_major_version' from source: facts 30575 1726867582.30746: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867582.30750: variable 'item' from source: unknown 30575 1726867582.30752: variable 'item' from source: unknown 30575 1726867582.30755: variable 'item' from source: unknown 30575 1726867582.30757: variable 'item' from source: unknown 30575 1726867582.30852: dumping result to json 30575 1726867582.30855: done dumping result, returning 30575 1726867582.30858: done running TaskExecutor() for managed_node3/TASK: Test [0affcac9-a3a5-e081-a588-0000000005b8] 30575 1726867582.30861: sending task result for task 0affcac9-a3a5-e081-a588-0000000005b8 30575 1726867582.30922: no more pending results, returning what we have 30575 1726867582.30930: in VariableManager get_vars() 30575 1726867582.30965: Calling all_inventory to load vars for managed_node3 30575 1726867582.30968: Calling groups_inventory to load vars for managed_node3 30575 1726867582.30972: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867582.30989: 
Calling all_plugins_play to load vars for managed_node3 30575 1726867582.30991: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867582.30997: done sending task result for task 0affcac9-a3a5-e081-a588-0000000005b8 30575 1726867582.31000: WORKER PROCESS EXITING 30575 1726867582.31086: Calling groups_plugins_play to load vars for managed_node3 30575 1726867582.33358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867582.35497: done with get_vars() 30575 1726867582.35585: variable 'ansible_search_path' from source: unknown 30575 1726867582.35587: variable 'ansible_search_path' from source: unknown 30575 1726867582.35641: we have included files to process 30575 1726867582.35642: generating all_blocks data 30575 1726867582.35644: done generating all_blocks data 30575 1726867582.35649: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 30575 1726867582.35650: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 30575 1726867582.35655: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 30575 1726867582.35983: done processing included file 30575 1726867582.35985: iterating over new_blocks loaded from include file 30575 1726867582.35987: in VariableManager get_vars() 30575 1726867582.36002: done with get_vars() 30575 1726867582.36003: filtering new block on tags 30575 1726867582.36038: done filtering new block on tags 30575 1726867582.36040: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml for managed_node3 
=> (item=tasks/create_bridge_profile_no_autoconnect.yml) 30575 1726867582.36045: extending task lists for all hosts with included blocks 30575 1726867582.36925: done extending task lists 30575 1726867582.36931: done processing included files 30575 1726867582.36932: results queue empty 30575 1726867582.36933: checking for any_errors_fatal 30575 1726867582.36937: done checking for any_errors_fatal 30575 1726867582.36938: checking for max_fail_percentage 30575 1726867582.36939: done checking for max_fail_percentage 30575 1726867582.36940: checking to see if all hosts have failed and the running result is not ok 30575 1726867582.36941: done checking to see if all hosts have failed 30575 1726867582.36941: getting the remaining hosts for this loop 30575 1726867582.36942: done getting the remaining hosts for this loop 30575 1726867582.36945: getting the next task for host managed_node3 30575 1726867582.36950: done getting next task for host managed_node3 30575 1726867582.36952: ^ task is: TASK: Include network role 30575 1726867582.36955: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867582.36958: getting variables 30575 1726867582.36959: in VariableManager get_vars() 30575 1726867582.36970: Calling all_inventory to load vars for managed_node3 30575 1726867582.36972: Calling groups_inventory to load vars for managed_node3 30575 1726867582.36974: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867582.36982: Calling all_plugins_play to load vars for managed_node3 30575 1726867582.36984: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867582.36987: Calling groups_plugins_play to load vars for managed_node3 30575 1726867582.38909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867582.41914: done with get_vars() 30575 1726867582.41936: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml:3 Friday 20 September 2024 17:26:22 -0400 (0:00:00.129) 0:00:17.797 ****** 30575 1726867582.42016: entering _queue_task() for managed_node3/include_role 30575 1726867582.42772: worker is 1 (out of 1 available) 30575 1726867582.42788: exiting _queue_task() for managed_node3/include_role 30575 1726867582.42802: done queuing things up, now waiting for results queue to drain 30575 1726867582.42803: waiting for pending results... 
30575 1726867582.43261: running TaskExecutor() for managed_node3/TASK: Include network role 30575 1726867582.43782: in run() - task 0affcac9-a3a5-e081-a588-0000000006b1 30575 1726867582.43786: variable 'ansible_search_path' from source: unknown 30575 1726867582.43788: variable 'ansible_search_path' from source: unknown 30575 1726867582.43791: calling self._execute() 30575 1726867582.43793: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867582.43796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867582.43797: variable 'omit' from source: magic vars 30575 1726867582.44979: variable 'ansible_distribution_major_version' from source: facts 30575 1726867582.44983: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867582.44986: _execute() done 30575 1726867582.44988: dumping result to json 30575 1726867582.44991: done dumping result, returning 30575 1726867582.44993: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcac9-a3a5-e081-a588-0000000006b1] 30575 1726867582.44996: sending task result for task 0affcac9-a3a5-e081-a588-0000000006b1 30575 1726867582.45109: no more pending results, returning what we have 30575 1726867582.45115: in VariableManager get_vars() 30575 1726867582.45159: Calling all_inventory to load vars for managed_node3 30575 1726867582.45162: Calling groups_inventory to load vars for managed_node3 30575 1726867582.45166: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867582.45182: Calling all_plugins_play to load vars for managed_node3 30575 1726867582.45186: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867582.45190: Calling groups_plugins_play to load vars for managed_node3 30575 1726867582.45888: done sending task result for task 0affcac9-a3a5-e081-a588-0000000006b1 30575 1726867582.46402: WORKER PROCESS EXITING 30575 1726867582.48332: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867582.51272: done with get_vars() 30575 1726867582.51301: variable 'ansible_search_path' from source: unknown 30575 1726867582.51303: variable 'ansible_search_path' from source: unknown 30575 1726867582.51712: variable 'omit' from source: magic vars 30575 1726867582.51758: variable 'omit' from source: magic vars 30575 1726867582.51773: variable 'omit' from source: magic vars 30575 1726867582.51980: we have included files to process 30575 1726867582.51982: generating all_blocks data 30575 1726867582.51984: done generating all_blocks data 30575 1726867582.51985: processing included file: fedora.linux_system_roles.network 30575 1726867582.52010: in VariableManager get_vars() 30575 1726867582.52024: done with get_vars() 30575 1726867582.52054: in VariableManager get_vars() 30575 1726867582.52072: done with get_vars() 30575 1726867582.52116: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30575 1726867582.52440: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30575 1726867582.52515: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30575 1726867582.53595: in VariableManager get_vars() 30575 1726867582.53617: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867582.57317: iterating over new_blocks loaded from include file 30575 1726867582.57320: in VariableManager get_vars() 30575 1726867582.57339: done with get_vars() 30575 1726867582.57341: filtering new block on tags 30575 1726867582.58115: done filtering new block on tags 30575 1726867582.58119: in VariableManager get_vars() 30575 1726867582.58135: done with get_vars() 30575 1726867582.58137: filtering new block on tags 30575 1726867582.58155: done 
filtering new block on tags 30575 1726867582.58157: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30575 1726867582.58163: extending task lists for all hosts with included blocks 30575 1726867582.58521: done extending task lists 30575 1726867582.58523: done processing included files 30575 1726867582.58523: results queue empty 30575 1726867582.58524: checking for any_errors_fatal 30575 1726867582.58527: done checking for any_errors_fatal 30575 1726867582.58528: checking for max_fail_percentage 30575 1726867582.58529: done checking for max_fail_percentage 30575 1726867582.58530: checking to see if all hosts have failed and the running result is not ok 30575 1726867582.58531: done checking to see if all hosts have failed 30575 1726867582.58532: getting the remaining hosts for this loop 30575 1726867582.58533: done getting the remaining hosts for this loop 30575 1726867582.58536: getting the next task for host managed_node3 30575 1726867582.58540: done getting next task for host managed_node3 30575 1726867582.58543: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867582.58546: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867582.58556: getting variables 30575 1726867582.58557: in VariableManager get_vars() 30575 1726867582.58568: Calling all_inventory to load vars for managed_node3 30575 1726867582.58571: Calling groups_inventory to load vars for managed_node3 30575 1726867582.58573: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867582.58782: Calling all_plugins_play to load vars for managed_node3 30575 1726867582.58786: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867582.58790: Calling groups_plugins_play to load vars for managed_node3 30575 1726867582.60958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867582.63371: done with get_vars() 30575 1726867582.63395: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:26:22 -0400 (0:00:00.214) 0:00:18.012 ****** 30575 1726867582.63473: entering _queue_task() for managed_node3/include_tasks 30575 1726867582.63839: worker is 1 (out of 1 available) 30575 1726867582.63853: exiting _queue_task() for managed_node3/include_tasks 30575 1726867582.63865: done queuing things up, now waiting for results queue to drain 30575 1726867582.63866: waiting for pending results... 
30575 1726867582.64163: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867582.64289: in run() - task 0affcac9-a3a5-e081-a588-00000000072f 30575 1726867582.64311: variable 'ansible_search_path' from source: unknown 30575 1726867582.64315: variable 'ansible_search_path' from source: unknown 30575 1726867582.64350: calling self._execute() 30575 1726867582.64440: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867582.64444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867582.64455: variable 'omit' from source: magic vars 30575 1726867582.64830: variable 'ansible_distribution_major_version' from source: facts 30575 1726867582.64848: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867582.64854: _execute() done 30575 1726867582.64859: dumping result to json 30575 1726867582.64863: done dumping result, returning 30575 1726867582.64872: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-e081-a588-00000000072f] 30575 1726867582.64879: sending task result for task 0affcac9-a3a5-e081-a588-00000000072f 30575 1726867582.64969: done sending task result for task 0affcac9-a3a5-e081-a588-00000000072f 30575 1726867582.64972: WORKER PROCESS EXITING 30575 1726867582.65018: no more pending results, returning what we have 30575 1726867582.65024: in VariableManager get_vars() 30575 1726867582.65067: Calling all_inventory to load vars for managed_node3 30575 1726867582.65070: Calling groups_inventory to load vars for managed_node3 30575 1726867582.65072: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867582.65087: Calling all_plugins_play to load vars for managed_node3 30575 1726867582.65090: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867582.65093: Calling 
groups_plugins_play to load vars for managed_node3 30575 1726867582.66908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867582.70009: done with get_vars() 30575 1726867582.70036: variable 'ansible_search_path' from source: unknown 30575 1726867582.70037: variable 'ansible_search_path' from source: unknown 30575 1726867582.70080: we have included files to process 30575 1726867582.70081: generating all_blocks data 30575 1726867582.70083: done generating all_blocks data 30575 1726867582.70086: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867582.70087: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867582.70090: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867582.70828: done processing included file 30575 1726867582.70831: iterating over new_blocks loaded from include file 30575 1726867582.70833: in VariableManager get_vars() 30575 1726867582.70858: done with get_vars() 30575 1726867582.70860: filtering new block on tags 30575 1726867582.70901: done filtering new block on tags 30575 1726867582.70905: in VariableManager get_vars() 30575 1726867582.70933: done with get_vars() 30575 1726867582.70935: filtering new block on tags 30575 1726867582.70989: done filtering new block on tags 30575 1726867582.70992: in VariableManager get_vars() 30575 1726867582.71018: done with get_vars() 30575 1726867582.71020: filtering new block on tags 30575 1726867582.71064: done filtering new block on tags 30575 1726867582.71067: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30575 1726867582.71072: extending task lists for 
all hosts with included blocks 30575 1726867582.72874: done extending task lists 30575 1726867582.72875: done processing included files 30575 1726867582.72876: results queue empty 30575 1726867582.72879: checking for any_errors_fatal 30575 1726867582.72882: done checking for any_errors_fatal 30575 1726867582.72882: checking for max_fail_percentage 30575 1726867582.72883: done checking for max_fail_percentage 30575 1726867582.72884: checking to see if all hosts have failed and the running result is not ok 30575 1726867582.72885: done checking to see if all hosts have failed 30575 1726867582.72886: getting the remaining hosts for this loop 30575 1726867582.72887: done getting the remaining hosts for this loop 30575 1726867582.72889: getting the next task for host managed_node3 30575 1726867582.72894: done getting next task for host managed_node3 30575 1726867582.72896: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867582.72900: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867582.72908: getting variables 30575 1726867582.72909: in VariableManager get_vars() 30575 1726867582.72920: Calling all_inventory to load vars for managed_node3 30575 1726867582.72922: Calling groups_inventory to load vars for managed_node3 30575 1726867582.72926: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867582.72931: Calling all_plugins_play to load vars for managed_node3 30575 1726867582.72933: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867582.72935: Calling groups_plugins_play to load vars for managed_node3 30575 1726867582.75255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867582.78571: done with get_vars() 30575 1726867582.78602: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:26:22 -0400 (0:00:00.154) 0:00:18.166 ****** 30575 1726867582.78920: entering _queue_task() for managed_node3/setup 30575 1726867582.79403: worker is 1 (out of 1 available) 30575 1726867582.79533: exiting _queue_task() for managed_node3/setup 30575 1726867582.79546: done queuing things up, now waiting for results queue to drain 30575 1726867582.79549: waiting for pending results... 
30575 1726867582.79985: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867582.80031: in run() - task 0affcac9-a3a5-e081-a588-00000000078c 30575 1726867582.80049: variable 'ansible_search_path' from source: unknown 30575 1726867582.80053: variable 'ansible_search_path' from source: unknown 30575 1726867582.80093: calling self._execute() 30575 1726867582.80184: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867582.80191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867582.80195: variable 'omit' from source: magic vars 30575 1726867582.80556: variable 'ansible_distribution_major_version' from source: facts 30575 1726867582.80567: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867582.80842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867582.83646: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867582.83710: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867582.83745: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867582.83783: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867582.83812: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867582.83900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867582.83987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867582.83990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867582.83993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867582.84281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867582.84285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867582.84288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867582.84290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867582.84292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867582.84294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867582.84296: variable '__network_required_facts' from source: role 
'' defaults 30575 1726867582.84299: variable 'ansible_facts' from source: unknown 30575 1726867582.84953: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30575 1726867582.84960: when evaluation is False, skipping this task 30575 1726867582.84963: _execute() done 30575 1726867582.84966: dumping result to json 30575 1726867582.84968: done dumping result, returning 30575 1726867582.84971: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-e081-a588-00000000078c] 30575 1726867582.84981: sending task result for task 0affcac9-a3a5-e081-a588-00000000078c 30575 1726867582.85180: done sending task result for task 0affcac9-a3a5-e081-a588-00000000078c skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867582.85222: no more pending results, returning what we have 30575 1726867582.85228: results queue empty 30575 1726867582.85229: checking for any_errors_fatal 30575 1726867582.85230: done checking for any_errors_fatal 30575 1726867582.85231: checking for max_fail_percentage 30575 1726867582.85232: done checking for max_fail_percentage 30575 1726867582.85233: checking to see if all hosts have failed and the running result is not ok 30575 1726867582.85234: done checking to see if all hosts have failed 30575 1726867582.85234: getting the remaining hosts for this loop 30575 1726867582.85235: done getting the remaining hosts for this loop 30575 1726867582.85239: getting the next task for host managed_node3 30575 1726867582.85250: done getting next task for host managed_node3 30575 1726867582.85253: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867582.85259: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867582.85274: getting variables 30575 1726867582.85276: in VariableManager get_vars() 30575 1726867582.85308: Calling all_inventory to load vars for managed_node3 30575 1726867582.85310: Calling groups_inventory to load vars for managed_node3 30575 1726867582.85312: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867582.85325: Calling all_plugins_play to load vars for managed_node3 30575 1726867582.85328: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867582.85332: Calling groups_plugins_play to load vars for managed_node3 30575 1726867582.86006: WORKER PROCESS EXITING 30575 1726867582.88400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867582.91359: done with get_vars() 30575 1726867582.91385: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:26:22 -0400 (0:00:00.125) 0:00:18.292 ****** 30575 1726867582.91490: entering _queue_task() for managed_node3/stat 30575 1726867582.91839: worker is 1 (out of 1 available) 30575 1726867582.91851: exiting _queue_task() for managed_node3/stat 30575 1726867582.91862: done queuing things up, now waiting for results queue to drain 30575 1726867582.91864: waiting for pending results... 
30575 1726867582.92149: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867582.92321: in run() - task 0affcac9-a3a5-e081-a588-00000000078e 30575 1726867582.92347: variable 'ansible_search_path' from source: unknown 30575 1726867582.92356: variable 'ansible_search_path' from source: unknown 30575 1726867582.92409: calling self._execute() 30575 1726867582.92511: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867582.92534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867582.92584: variable 'omit' from source: magic vars 30575 1726867582.93006: variable 'ansible_distribution_major_version' from source: facts 30575 1726867582.93026: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867582.93384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867582.93907: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867582.93962: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867582.94054: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867582.94152: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867582.94315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867582.94389: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867582.94470: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867582.94559: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867582.94685: variable '__network_is_ostree' from source: set_fact 30575 1726867582.94704: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867582.94733: when evaluation is False, skipping this task 30575 1726867582.94764: _execute() done 30575 1726867582.94767: dumping result to json 30575 1726867582.94770: done dumping result, returning 30575 1726867582.94773: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-e081-a588-00000000078e] 30575 1726867582.94775: sending task result for task 0affcac9-a3a5-e081-a588-00000000078e skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867582.95052: no more pending results, returning what we have 30575 1726867582.95056: results queue empty 30575 1726867582.95057: checking for any_errors_fatal 30575 1726867582.95067: done checking for any_errors_fatal 30575 1726867582.95067: checking for max_fail_percentage 30575 1726867582.95069: done checking for max_fail_percentage 30575 1726867582.95070: checking to see if all hosts have failed and the running result is not ok 30575 1726867582.95071: done checking to see if all hosts have failed 30575 1726867582.95072: getting the remaining hosts for this loop 30575 1726867582.95073: done getting the remaining hosts for this loop 30575 1726867582.95079: getting the next task for host managed_node3 30575 1726867582.95092: done getting next task for host managed_node3 30575 
1726867582.95095: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867582.95101: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867582.95128: getting variables 30575 1726867582.95131: in VariableManager get_vars() 30575 1726867582.95166: Calling all_inventory to load vars for managed_node3 30575 1726867582.95169: Calling groups_inventory to load vars for managed_node3 30575 1726867582.95171: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867582.95288: Calling all_plugins_play to load vars for managed_node3 30575 1726867582.95292: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867582.95297: done sending task result for task 0affcac9-a3a5-e081-a588-00000000078e 30575 1726867582.95300: WORKER PROCESS EXITING 30575 1726867582.95304: Calling groups_plugins_play to load vars for managed_node3 30575 1726867582.96886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867582.99546: done with get_vars() 30575 1726867582.99568: done getting variables 30575 1726867582.99631: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:26:22 -0400 (0:00:00.081) 0:00:18.374 ****** 30575 1726867582.99678: entering _queue_task() for managed_node3/set_fact 30575 1726867583.00032: worker is 1 (out of 1 available) 30575 1726867583.00043: exiting _queue_task() for managed_node3/set_fact 30575 1726867583.00056: done queuing things up, now waiting for results queue to drain 30575 1726867583.00058: waiting for pending results... 
30575 1726867583.00365: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867583.00548: in run() - task 0affcac9-a3a5-e081-a588-00000000078f 30575 1726867583.00571: variable 'ansible_search_path' from source: unknown 30575 1726867583.00583: variable 'ansible_search_path' from source: unknown 30575 1726867583.00636: calling self._execute() 30575 1726867583.00742: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867583.00781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867583.00798: variable 'omit' from source: magic vars 30575 1726867583.01586: variable 'ansible_distribution_major_version' from source: facts 30575 1726867583.01590: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867583.01855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867583.02355: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867583.02407: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867583.02445: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867583.02604: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867583.02762: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867583.02924: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867583.02956: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867583.02991: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867583.03097: variable '__network_is_ostree' from source: set_fact 30575 1726867583.03118: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867583.03127: when evaluation is False, skipping this task 30575 1726867583.03135: _execute() done 30575 1726867583.03143: dumping result to json 30575 1726867583.03152: done dumping result, returning 30575 1726867583.03167: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-e081-a588-00000000078f] 30575 1726867583.03181: sending task result for task 0affcac9-a3a5-e081-a588-00000000078f skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867583.03339: no more pending results, returning what we have 30575 1726867583.03344: results queue empty 30575 1726867583.03345: checking for any_errors_fatal 30575 1726867583.03352: done checking for any_errors_fatal 30575 1726867583.03353: checking for max_fail_percentage 30575 1726867583.03355: done checking for max_fail_percentage 30575 1726867583.03356: checking to see if all hosts have failed and the running result is not ok 30575 1726867583.03357: done checking to see if all hosts have failed 30575 1726867583.03357: getting the remaining hosts for this loop 30575 1726867583.03359: done getting the remaining hosts for this loop 30575 1726867583.03362: getting the next task for host managed_node3 30575 1726867583.03373: done getting next task for host managed_node3 30575 
1726867583.03379: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867583.03385: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867583.03402: getting variables 30575 1726867583.03404: in VariableManager get_vars() 30575 1726867583.03439: Calling all_inventory to load vars for managed_node3 30575 1726867583.03441: Calling groups_inventory to load vars for managed_node3 30575 1726867583.03444: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867583.03453: Calling all_plugins_play to load vars for managed_node3 30575 1726867583.03456: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867583.03458: Calling groups_plugins_play to load vars for managed_node3 30575 1726867583.03997: done sending task result for task 0affcac9-a3a5-e081-a588-00000000078f 30575 1726867583.04001: WORKER PROCESS EXITING 30575 1726867583.04332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867583.05546: done with get_vars() 30575 1726867583.05565: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:26:23 -0400 (0:00:00.059) 0:00:18.434 ****** 30575 1726867583.05664: entering _queue_task() for managed_node3/service_facts 30575 1726867583.06106: worker is 1 (out of 1 available) 30575 1726867583.06119: exiting _queue_task() for managed_node3/service_facts 30575 1726867583.06134: done queuing things up, now waiting for results queue to drain 30575 1726867583.06135: waiting for pending results... 
30575 1726867583.06507: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867583.06626: in run() - task 0affcac9-a3a5-e081-a588-000000000791 30575 1726867583.06637: variable 'ansible_search_path' from source: unknown 30575 1726867583.06641: variable 'ansible_search_path' from source: unknown 30575 1726867583.06670: calling self._execute() 30575 1726867583.06747: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867583.06751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867583.06760: variable 'omit' from source: magic vars 30575 1726867583.07037: variable 'ansible_distribution_major_version' from source: facts 30575 1726867583.07048: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867583.07052: variable 'omit' from source: magic vars 30575 1726867583.07103: variable 'omit' from source: magic vars 30575 1726867583.07128: variable 'omit' from source: magic vars 30575 1726867583.07158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867583.07187: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867583.07203: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867583.07215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867583.07229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867583.07252: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867583.07254: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867583.07259: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867583.07331: Set connection var ansible_pipelining to False 30575 1726867583.07334: Set connection var ansible_shell_type to sh 30575 1726867583.07340: Set connection var ansible_shell_executable to /bin/sh 30575 1726867583.07345: Set connection var ansible_timeout to 10 30575 1726867583.07350: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867583.07356: Set connection var ansible_connection to ssh 30575 1726867583.07375: variable 'ansible_shell_executable' from source: unknown 30575 1726867583.07380: variable 'ansible_connection' from source: unknown 30575 1726867583.07383: variable 'ansible_module_compression' from source: unknown 30575 1726867583.07387: variable 'ansible_shell_type' from source: unknown 30575 1726867583.07390: variable 'ansible_shell_executable' from source: unknown 30575 1726867583.07392: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867583.07394: variable 'ansible_pipelining' from source: unknown 30575 1726867583.07396: variable 'ansible_timeout' from source: unknown 30575 1726867583.07398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867583.07540: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867583.07548: variable 'omit' from source: magic vars 30575 1726867583.07553: starting attempt loop 30575 1726867583.07556: running the handler 30575 1726867583.07567: _low_level_execute_command(): starting 30575 1726867583.07574: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867583.08035: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30575 1726867583.08074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867583.08080: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867583.08083: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867583.08086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867583.08130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867583.08133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867583.08136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867583.08199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867583.09890: stdout chunk (state=3): >>>/root <<< 30575 1726867583.10018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867583.10021: stdout chunk (state=3): >>><<< 30575 1726867583.10024: stderr chunk (state=3): >>><<< 30575 1726867583.10284: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867583.10288: _low_level_execute_command(): starting 30575 1726867583.10291: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867583.1004333-31428-200952313477418 `" && echo ansible-tmp-1726867583.1004333-31428-200952313477418="` echo /root/.ansible/tmp/ansible-tmp-1726867583.1004333-31428-200952313477418 `" ) && sleep 0' 30575 1726867583.10606: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867583.10616: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867583.10627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867583.10642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867583.10667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867583.10670: stderr chunk 
(state=3): >>>debug2: match not found <<< 30575 1726867583.10673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867583.10704: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867583.10707: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867583.10710: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867583.10712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867583.10734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867583.10739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867583.10769: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867583.10773: stderr chunk (state=3): >>>debug2: match found <<< 30575 1726867583.10775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867583.10827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867583.10849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867583.10903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867583.12783: stdout chunk (state=3): >>>ansible-tmp-1726867583.1004333-31428-200952313477418=/root/.ansible/tmp/ansible-tmp-1726867583.1004333-31428-200952313477418 <<< 30575 1726867583.12914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867583.12938: stderr chunk (state=3): >>><<< 30575 1726867583.12942: stdout chunk (state=3): >>><<< 30575 1726867583.12965: _low_level_execute_command() 
done: rc=0, stdout=ansible-tmp-1726867583.1004333-31428-200952313477418=/root/.ansible/tmp/ansible-tmp-1726867583.1004333-31428-200952313477418 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867583.13012: variable 'ansible_module_compression' from source: unknown 30575 1726867583.13091: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30575 1726867583.13094: variable 'ansible_facts' from source: unknown 30575 1726867583.13245: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867583.1004333-31428-200952313477418/AnsiballZ_service_facts.py 30575 1726867583.13404: Sending initial data 30575 1726867583.13407: Sent initial data (162 bytes) 30575 1726867583.13931: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config <<< 30575 1726867583.13934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867583.13937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867583.13939: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867583.13941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867583.13994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867583.14001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867583.14041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867583.15635: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" 
revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867583.15704: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867583.15903: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp7hwoaqz6 /root/.ansible/tmp/ansible-tmp-1726867583.1004333-31428-200952313477418/AnsiballZ_service_facts.py <<< 30575 1726867583.15906: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867583.1004333-31428-200952313477418/AnsiballZ_service_facts.py" <<< 30575 1726867583.15943: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp7hwoaqz6" to remote "/root/.ansible/tmp/ansible-tmp-1726867583.1004333-31428-200952313477418/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867583.1004333-31428-200952313477418/AnsiballZ_service_facts.py" <<< 30575 1726867583.16853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867583.16856: stdout chunk (state=3): >>><<< 30575 1726867583.16859: stderr chunk (state=3): >>><<< 30575 1726867583.17066: done transferring module to remote 30575 1726867583.17069: _low_level_execute_command(): starting 30575 1726867583.17072: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867583.1004333-31428-200952313477418/ /root/.ansible/tmp/ansible-tmp-1726867583.1004333-31428-200952313477418/AnsiballZ_service_facts.py && sleep 0' 30575 1726867583.17798: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867583.17830: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30575 1726867583.17846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867583.17945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867583.17973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867583.17993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867583.18004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867583.18058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867583.19790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867583.19813: stderr chunk (state=3): >>><<< 30575 1726867583.19815: stdout chunk (state=3): >>><<< 30575 1726867583.19826: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867583.19888: _low_level_execute_command(): starting 30575 1726867583.19891: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867583.1004333-31428-200952313477418/AnsiballZ_service_facts.py && sleep 0' 30575 1726867583.20244: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867583.20248: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867583.20251: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867583.20296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867583.20300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867583.20351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867584.76139: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": 
{"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": 
"rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": 
{"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": 
{"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30575 1726867584.77588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867584.77604: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. <<< 30575 1726867584.77661: stderr chunk (state=3): >>><<< 30575 1726867584.77681: stdout chunk (state=3): >>><<< 30575 1726867584.77715: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": 
"gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": 
{"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
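The `service_facts` result that ends above is a single dict keyed by systemd unit name, each entry carrying `name`, `state`, `status`, and `source`. A minimal sketch of how such a payload is typically filtered for running units (the two-entry `services` dict below is a hypothetical slice of the real output, not the full result logged above):

```python
# Hypothetical slice of the ansible_facts["services"] dict produced by the
# service_facts module in the log above; keys and fields mirror the log.
services = {
    "NetworkManager.service": {"name": "NetworkManager.service",
                               "state": "running", "status": "enabled",
                               "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service",
                          "state": "inactive", "status": "disabled",
                          "source": "systemd"},
}

# Keep only the units systemd reports as currently running.
running = sorted(name for name, info in services.items()
                 if info["state"] == "running")
print(running)  # ['NetworkManager.service']
```

A role consuming this fact (e.g. to decide between NetworkManager and initscripts providers) would apply the same `state == "running"` test against `ansible_facts.services` in a conditional.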
30575 1726867584.78503: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867583.1004333-31428-200952313477418/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867584.78506: _low_level_execute_command(): starting 30575 1726867584.78508: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867583.1004333-31428-200952313477418/ > /dev/null 2>&1 && sleep 0' 30575 1726867584.79161: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867584.79286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867584.79301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867584.79329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867584.79409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867584.81283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867584.81286: stdout chunk (state=3): >>><<< 30575 1726867584.81288: stderr chunk (state=3): >>><<< 30575 1726867584.81291: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867584.81293: handler run complete 30575 1726867584.81740: variable 'ansible_facts' from source: unknown 30575 1726867584.81885: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867584.82382: variable 'ansible_facts' from source: unknown 30575 1726867584.82510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867584.82909: attempt loop complete, returning result 30575 1726867584.82912: _execute() done 30575 1726867584.82914: dumping result to json 30575 1726867584.82916: done dumping result, returning 30575 1726867584.82918: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-e081-a588-000000000791] 30575 1726867584.82920: sending task result for task 0affcac9-a3a5-e081-a588-000000000791 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867584.83903: no more pending results, returning what we have 30575 1726867584.83906: results queue empty 30575 1726867584.83907: checking for any_errors_fatal 30575 1726867584.83911: done checking for any_errors_fatal 30575 1726867584.83911: checking for max_fail_percentage 30575 1726867584.83913: done checking for max_fail_percentage 30575 1726867584.83914: checking to see if all hosts have failed and the running result is not ok 30575 1726867584.83914: done checking to see if all hosts have failed 30575 1726867584.83915: getting the remaining hosts for this loop 30575 1726867584.83916: done getting the remaining hosts for this loop 30575 1726867584.83920: getting the next task for host managed_node3 30575 1726867584.83926: done getting next task for host managed_node3 30575 1726867584.83930: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867584.83936: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867584.83948: getting variables 30575 1726867584.83950: in VariableManager get_vars() 30575 1726867584.83985: Calling all_inventory to load vars for managed_node3 30575 1726867584.83988: Calling groups_inventory to load vars for managed_node3 30575 1726867584.83991: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867584.83996: done sending task result for task 0affcac9-a3a5-e081-a588-000000000791 30575 1726867584.83999: WORKER PROCESS EXITING 30575 1726867584.84008: Calling all_plugins_play to load vars for managed_node3 30575 1726867584.84011: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867584.84014: Calling groups_plugins_play to load vars for managed_node3 30575 1726867584.85201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867584.86418: done with get_vars() 30575 1726867584.86433: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:26:24 -0400 (0:00:01.808) 0:00:20.242 ****** 30575 1726867584.86503: entering _queue_task() for managed_node3/package_facts 30575 1726867584.86727: worker is 1 (out of 1 available) 30575 1726867584.86741: exiting _queue_task() for managed_node3/package_facts 30575 1726867584.86755: done queuing things up, now waiting for results queue to drain 30575 1726867584.86757: waiting for pending results... 
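The timing suffix printed with the TASK banner above, `(0:00:01.808) 0:00:20.242`, is the duration of the task that just finished, followed by the cumulative elapsed time of the run. A small sketch of pulling both values out of such a line (the regex and variable names are illustrative, not part of Ansible):

```python
import re

# Timing line as emitted in the log above: "(task duration) cumulative elapsed".
line = "Friday 20 September 2024  17:26:24 -0400 (0:00:01.808) 0:00:20.242"

# First H:MM:SS.fff group in parentheses is the task time; the next is the total.
m = re.search(r"\((\d+):(\d+):([\d.]+)\)\s+(\d+):(\d+):([\d.]+)", line)
h1, m1, s1, h2, m2, s2 = m.groups()
task_seconds = int(h1) * 3600 + int(m1) * 60 + float(s1)
total_seconds = int(h2) * 3600 + int(m2) * 60 + float(s2)
print(task_seconds, total_seconds)  # 1.808 20.242
```

Summing these per-task durations across a long log is a quick way to find which tasks (here, `service_facts` at 1.808s) dominate the run time.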
30575 1726867584.86939: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867584.87050: in run() - task 0affcac9-a3a5-e081-a588-000000000792 30575 1726867584.87061: variable 'ansible_search_path' from source: unknown 30575 1726867584.87066: variable 'ansible_search_path' from source: unknown 30575 1726867584.87099: calling self._execute() 30575 1726867584.87167: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867584.87171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867584.87181: variable 'omit' from source: magic vars 30575 1726867584.87457: variable 'ansible_distribution_major_version' from source: facts 30575 1726867584.87467: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867584.87473: variable 'omit' from source: magic vars 30575 1726867584.87528: variable 'omit' from source: magic vars 30575 1726867584.87554: variable 'omit' from source: magic vars 30575 1726867584.87585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867584.87614: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867584.87647: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867584.87658: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867584.87667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867584.87718: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867584.87721: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867584.87724: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867584.87814: Set connection var ansible_pipelining to False 30575 1726867584.87817: Set connection var ansible_shell_type to sh 30575 1726867584.87820: Set connection var ansible_shell_executable to /bin/sh 30575 1726867584.87928: Set connection var ansible_timeout to 10 30575 1726867584.87931: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867584.87934: Set connection var ansible_connection to ssh 30575 1726867584.87936: variable 'ansible_shell_executable' from source: unknown 30575 1726867584.87938: variable 'ansible_connection' from source: unknown 30575 1726867584.87940: variable 'ansible_module_compression' from source: unknown 30575 1726867584.87942: variable 'ansible_shell_type' from source: unknown 30575 1726867584.87944: variable 'ansible_shell_executable' from source: unknown 30575 1726867584.87946: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867584.87948: variable 'ansible_pipelining' from source: unknown 30575 1726867584.87950: variable 'ansible_timeout' from source: unknown 30575 1726867584.87951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867584.88189: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867584.88194: variable 'omit' from source: magic vars 30575 1726867584.88196: starting attempt loop 30575 1726867584.88198: running the handler 30575 1726867584.88199: _low_level_execute_command(): starting 30575 1726867584.88201: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867584.88758: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867584.88771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30575 1726867584.88797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867584.88801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867584.88842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867584.88845: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867584.88848: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867584.88851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867584.88896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867584.88934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867584.88938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867584.88946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867584.88991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867584.90592: stdout chunk (state=3): >>>/root <<< 30575 1726867584.90691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867584.90713: stderr chunk (state=3): >>><<< 30575 1726867584.90716: stdout chunk (state=3): >>><<< 30575 1726867584.90747: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867584.90761: _low_level_execute_command(): starting 30575 1726867584.90767: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867584.9074032-31493-73678579970485 `" && echo ansible-tmp-1726867584.9074032-31493-73678579970485="` echo /root/.ansible/tmp/ansible-tmp-1726867584.9074032-31493-73678579970485 `" ) && sleep 0' 30575 1726867584.91343: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867584.91346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867584.91349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867584.91357: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867584.91366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867584.91369: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867584.91381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867584.91425: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867584.91428: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867584.91431: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867584.91433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867584.91435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867584.91439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867584.91473: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867584.91476: stderr chunk (state=3): >>>debug2: match found <<< 30575 1726867584.91480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867584.91534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867584.91541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867584.91558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867584.91644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867584.93519: stdout chunk (state=3): 
>>>ansible-tmp-1726867584.9074032-31493-73678579970485=/root/.ansible/tmp/ansible-tmp-1726867584.9074032-31493-73678579970485 <<< 30575 1726867584.93670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867584.93673: stdout chunk (state=3): >>><<< 30575 1726867584.93675: stderr chunk (state=3): >>><<< 30575 1726867584.93707: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867584.9074032-31493-73678579970485=/root/.ansible/tmp/ansible-tmp-1726867584.9074032-31493-73678579970485 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867584.93982: variable 'ansible_module_compression' from source: unknown 30575 1726867584.93986: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30575 1726867584.93989: variable 
'ansible_facts' from source: unknown 30575 1726867584.94045: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867584.9074032-31493-73678579970485/AnsiballZ_package_facts.py 30575 1726867584.94256: Sending initial data 30575 1726867584.94260: Sent initial data (161 bytes) 30575 1726867584.94761: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867584.94766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867584.94769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867584.94844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867584.94869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867584.94936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867584.96464: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867584.96474: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867584.96509: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867584.96551: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp2_e_9omb /root/.ansible/tmp/ansible-tmp-1726867584.9074032-31493-73678579970485/AnsiballZ_package_facts.py <<< 30575 1726867584.96560: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867584.9074032-31493-73678579970485/AnsiballZ_package_facts.py" <<< 30575 1726867584.96595: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp2_e_9omb" to remote "/root/.ansible/tmp/ansible-tmp-1726867584.9074032-31493-73678579970485/AnsiballZ_package_facts.py" <<< 30575 1726867584.96604: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867584.9074032-31493-73678579970485/AnsiballZ_package_facts.py" <<< 30575 1726867584.97891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867584.97916: stderr chunk (state=3): >>><<< 30575 1726867584.97920: stdout chunk (state=3): >>><<< 30575 1726867584.97951: done transferring module to remote 30575 1726867584.97960: _low_level_execute_command(): starting 30575 1726867584.97965: _low_level_execute_command(): 
executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867584.9074032-31493-73678579970485/ /root/.ansible/tmp/ansible-tmp-1726867584.9074032-31493-73678579970485/AnsiballZ_package_facts.py && sleep 0' 30575 1726867584.98349: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867584.98391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867584.98394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867584.98396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867584.98398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867584.98401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867584.98402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867584.98442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867584.98448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867584.98496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867585.00229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 
1726867585.00247: stderr chunk (state=3): >>><<< 30575 1726867585.00250: stdout chunk (state=3): >>><<< 30575 1726867585.00260: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867585.00263: _low_level_execute_command(): starting 30575 1726867585.00267: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867584.9074032-31493-73678579970485/AnsiballZ_package_facts.py && sleep 0' 30575 1726867585.00635: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867585.00665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867585.00669: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867585.00672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867585.00718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867585.00725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867585.00773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867585.44970: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": 
"20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": 
"17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30575 1726867585.45072: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": 
"cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", 
"release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": 
[{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": 
"sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": 
"510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", 
"release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": 
[{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": 
"2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": 
"python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": 
"24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30575 1726867585.46799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867585.46818: stderr chunk (state=3): >>><<< 30575 1726867585.46841: stdout chunk (state=3): >>><<< 30575 1726867585.46908: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
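The module result above ends with an `invocation` block showing it came from Ansible's built-in `package_facts` module, called with `manager: ["auto"]` and `strategy: first`; the surrounding trace also shows `'_ansible_no_log': True`, which is why the result is later printed as censored. A minimal task that would produce an equivalent invocation might look like the following sketch (the task name matches the one logged further down; the exact YAML layout in the role source is an assumption):

```yaml
# Hedged sketch of a task matching the package_facts invocation logged above.
# no_log: true corresponds to the "'_ansible_no_log': True" flag in the trace
# and to the censored result printed later in this run.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto      # autodetect the backend (rpm here, per "source": "rpm")
    strategy: first    # stop at the first package manager that succeeds
  no_log: true         # keep the large package list out of console output
```

The gathered facts land in `ansible_facts.packages`, keyed by package name, which is the structure visible in the JSON dump above.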
30575 1726867585.49272: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867584.9074032-31493-73678579970485/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867585.49369: _low_level_execute_command(): starting 30575 1726867585.49373: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867584.9074032-31493-73678579970485/ > /dev/null 2>&1 && sleep 0' 30575 1726867585.49958: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867585.49970: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867585.49986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867585.50044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867585.50108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867585.50128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867585.50157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867585.50230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867585.52127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867585.52130: stdout chunk (state=3): >>><<< 30575 1726867585.52133: stderr chunk (state=3): >>><<< 30575 1726867585.52282: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 30575 1726867585.52286: handler run complete 30575 1726867585.53006: variable 'ansible_facts' from source: unknown 30575 1726867585.53575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867585.55536: variable 'ansible_facts' from source: unknown 30575 1726867585.56006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867585.56766: attempt loop complete, returning result 30575 1726867585.56784: _execute() done 30575 1726867585.56791: dumping result to json 30575 1726867585.57017: done dumping result, returning 30575 1726867585.57036: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-e081-a588-000000000792] 30575 1726867585.57050: sending task result for task 0affcac9-a3a5-e081-a588-000000000792 30575 1726867585.59360: done sending task result for task 0affcac9-a3a5-e081-a588-000000000792 30575 1726867585.59364: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867585.59514: no more pending results, returning what we have 30575 1726867585.59516: results queue empty 30575 1726867585.59517: checking for any_errors_fatal 30575 1726867585.59521: done checking for any_errors_fatal 30575 1726867585.59522: checking for max_fail_percentage 30575 1726867585.59526: done checking for max_fail_percentage 30575 1726867585.59527: checking to see if all hosts have failed and the running result is not ok 30575 1726867585.59528: done checking to see if all hosts have failed 30575 1726867585.59528: getting the remaining hosts for this loop 30575 1726867585.59529: done getting the remaining hosts for this loop 30575 1726867585.59533: getting the next task for host managed_node3 30575 1726867585.59540: done 
getting next task for host managed_node3 30575 1726867585.59549: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867585.59555: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867585.59566: getting variables 30575 1726867585.59567: in VariableManager get_vars() 30575 1726867585.59597: Calling all_inventory to load vars for managed_node3 30575 1726867585.59599: Calling groups_inventory to load vars for managed_node3 30575 1726867585.59601: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867585.59610: Calling all_plugins_play to load vars for managed_node3 30575 1726867585.59612: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867585.59615: Calling groups_plugins_play to load vars for managed_node3 30575 1726867585.60884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867585.62536: done with get_vars() 30575 1726867585.62555: done getting variables 30575 1726867585.62740: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:26:25 -0400 (0:00:00.762) 0:00:21.005 ****** 30575 1726867585.62781: entering _queue_task() for managed_node3/debug 30575 1726867585.63480: worker is 1 (out of 1 available) 30575 1726867585.63604: exiting _queue_task() for managed_node3/debug 30575 1726867585.63616: done queuing things up, now waiting for results queue to drain 30575 1726867585.63618: waiting for pending results... 
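The task banner above points at `roles/network/tasks/main.yml:7` in the `fedora.linux_system_roles.network` collection, and the run later reports the result `Using network provider: nm`. A hedged reconstruction of that debug task (the `msg` template is inferred from the logged output; `network_provider` comes from an earlier `set_fact`, per the variable-source trace below):

```yaml
# Hedged sketch of the "Print network provider" task at
# roles/network/tasks/main.yml:7. The msg wording is inferred from the
# logged result "Using network provider: nm".
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```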
30575 1726867585.64194: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867585.64338: in run() - task 0affcac9-a3a5-e081-a588-000000000730 30575 1726867585.64541: variable 'ansible_search_path' from source: unknown 30575 1726867585.64545: variable 'ansible_search_path' from source: unknown 30575 1726867585.64548: calling self._execute() 30575 1726867585.64702: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867585.64715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867585.64730: variable 'omit' from source: magic vars 30575 1726867585.65448: variable 'ansible_distribution_major_version' from source: facts 30575 1726867585.65517: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867585.65591: variable 'omit' from source: magic vars 30575 1726867585.65663: variable 'omit' from source: magic vars 30575 1726867585.66043: variable 'network_provider' from source: set_fact 30575 1726867585.66047: variable 'omit' from source: magic vars 30575 1726867585.66050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867585.66084: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867585.66156: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867585.66182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867585.66201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867585.66264: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867585.66285: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867585.66299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867585.66454: Set connection var ansible_pipelining to False 30575 1726867585.66463: Set connection var ansible_shell_type to sh 30575 1726867585.66475: Set connection var ansible_shell_executable to /bin/sh 30575 1726867585.66550: Set connection var ansible_timeout to 10 30575 1726867585.66553: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867585.66556: Set connection var ansible_connection to ssh 30575 1726867585.66559: variable 'ansible_shell_executable' from source: unknown 30575 1726867585.66561: variable 'ansible_connection' from source: unknown 30575 1726867585.66563: variable 'ansible_module_compression' from source: unknown 30575 1726867585.66565: variable 'ansible_shell_type' from source: unknown 30575 1726867585.66566: variable 'ansible_shell_executable' from source: unknown 30575 1726867585.66574: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867585.66586: variable 'ansible_pipelining' from source: unknown 30575 1726867585.66594: variable 'ansible_timeout' from source: unknown 30575 1726867585.66602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867585.66757: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867585.66788: variable 'omit' from source: magic vars 30575 1726867585.66800: starting attempt loop 30575 1726867585.66879: running the handler 30575 1726867585.66882: handler run complete 30575 1726867585.66885: attempt loop complete, returning result 30575 1726867585.66887: _execute() done 30575 1726867585.66891: dumping result to json 30575 1726867585.66898: done dumping result, returning 
30575 1726867585.66911: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-e081-a588-000000000730] 30575 1726867585.66921: sending task result for task 0affcac9-a3a5-e081-a588-000000000730 ok: [managed_node3] => {} MSG: Using network provider: nm 30575 1726867585.67152: no more pending results, returning what we have 30575 1726867585.67156: results queue empty 30575 1726867585.67157: checking for any_errors_fatal 30575 1726867585.67168: done checking for any_errors_fatal 30575 1726867585.67169: checking for max_fail_percentage 30575 1726867585.67171: done checking for max_fail_percentage 30575 1726867585.67172: checking to see if all hosts have failed and the running result is not ok 30575 1726867585.67173: done checking to see if all hosts have failed 30575 1726867585.67174: getting the remaining hosts for this loop 30575 1726867585.67176: done getting the remaining hosts for this loop 30575 1726867585.67182: getting the next task for host managed_node3 30575 1726867585.67191: done getting next task for host managed_node3 30575 1726867585.67197: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867585.67203: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867585.67214: getting variables 30575 1726867585.67216: in VariableManager get_vars() 30575 1726867585.67257: Calling all_inventory to load vars for managed_node3 30575 1726867585.67260: Calling groups_inventory to load vars for managed_node3 30575 1726867585.67263: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867585.67274: Calling all_plugins_play to load vars for managed_node3 30575 1726867585.67310: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867585.67316: done sending task result for task 0affcac9-a3a5-e081-a588-000000000730 30575 1726867585.67319: WORKER PROCESS EXITING 30575 1726867585.67326: Calling groups_plugins_play to load vars for managed_node3 30575 1726867585.68440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867585.74919: done with get_vars() 30575 1726867585.74947: done getting variables 30575 1726867585.75043: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:26:25 -0400 (0:00:00.122) 0:00:21.128 ****** 30575 1726867585.75080: entering _queue_task() for managed_node3/fail 30575 1726867585.75917: worker is 1 (out of 1 available) 30575 1726867585.75930: exiting _queue_task() for managed_node3/fail 30575 1726867585.75944: done queuing things up, now waiting for results queue to drain 30575 1726867585.75946: waiting for pending results... 30575 1726867585.76502: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867585.76842: in run() - task 0affcac9-a3a5-e081-a588-000000000731 30575 1726867585.76991: variable 'ansible_search_path' from source: unknown 30575 1726867585.76996: variable 'ansible_search_path' from source: unknown 30575 1726867585.77042: calling self._execute() 30575 1726867585.77164: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867585.77225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867585.77247: variable 'omit' from source: magic vars 30575 1726867585.77662: variable 'ansible_distribution_major_version' from source: facts 30575 1726867585.77683: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867585.77809: variable 'network_state' from source: role '' defaults 30575 1726867585.77829: Evaluated conditional (network_state != {}): False 30575 1726867585.77838: when evaluation is False, skipping this task 30575 1726867585.77846: _execute() done 30575 1726867585.77983: dumping result to json 30575 1726867585.77987: done dumping result, returning 30575 1726867585.77990: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-e081-a588-000000000731] 30575 1726867585.77993: sending task result for task 0affcac9-a3a5-e081-a588-000000000731 30575 1726867585.78065: done sending task result for task 0affcac9-a3a5-e081-a588-000000000731 30575 1726867585.78069: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867585.78126: no more pending results, returning what we have 30575 1726867585.78130: results queue empty 30575 1726867585.78131: checking for any_errors_fatal 30575 1726867585.78138: done checking for any_errors_fatal 30575 1726867585.78139: checking for max_fail_percentage 30575 1726867585.78140: done checking for max_fail_percentage 30575 1726867585.78141: checking to see if all hosts have failed and the running result is not ok 30575 1726867585.78142: done checking to see if all hosts have failed 30575 1726867585.78143: getting the remaining hosts for this loop 30575 1726867585.78144: done getting the remaining hosts for this loop 30575 1726867585.78148: getting the next task for host managed_node3 30575 1726867585.78157: done getting next task for host managed_node3 30575 1726867585.78160: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867585.78165: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867585.78279: getting variables 30575 1726867585.78282: in VariableManager get_vars() 30575 1726867585.78314: Calling all_inventory to load vars for managed_node3 30575 1726867585.78317: Calling groups_inventory to load vars for managed_node3 30575 1726867585.78319: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867585.78328: Calling all_plugins_play to load vars for managed_node3 30575 1726867585.78330: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867585.78333: Calling groups_plugins_play to load vars for managed_node3 30575 1726867585.79884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867585.81624: done with get_vars() 30575 1726867585.81651: done getting variables 30575 1726867585.81714: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:26:25 -0400 (0:00:00.066) 0:00:21.195 ****** 30575 1726867585.81752: entering _queue_task() for managed_node3/fail 30575 1726867585.82067: worker is 1 (out of 1 available) 30575 1726867585.82187: exiting _queue_task() for managed_node3/fail 30575 1726867585.82200: done queuing things up, now waiting for results queue to drain 30575 1726867585.82201: waiting for pending results... 30575 1726867585.82370: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867585.82586: in run() - task 0affcac9-a3a5-e081-a588-000000000732 30575 1726867585.82590: variable 'ansible_search_path' from source: unknown 30575 1726867585.82593: variable 'ansible_search_path' from source: unknown 30575 1726867585.82597: calling self._execute() 30575 1726867585.82698: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867585.82710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867585.82726: variable 'omit' from source: magic vars 30575 1726867585.83102: variable 'ansible_distribution_major_version' from source: facts 30575 1726867585.83119: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867585.83248: variable 'network_state' from source: role '' defaults 30575 1726867585.83262: Evaluated conditional (network_state != {}): False 30575 1726867585.83271: when evaluation is False, skipping this task 30575 1726867585.83280: _execute() done 30575 1726867585.83289: dumping result to json 30575 1726867585.83296: done dumping result, returning 30575 1726867585.83308: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-e081-a588-000000000732] 30575 1726867585.83318: sending task result for task 0affcac9-a3a5-e081-a588-000000000732 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867585.83505: no more pending results, returning what we have 30575 1726867585.83510: results queue empty 30575 1726867585.83511: checking for any_errors_fatal 30575 1726867585.83518: done checking for any_errors_fatal 30575 1726867585.83518: checking for max_fail_percentage 30575 1726867585.83520: done checking for max_fail_percentage 30575 1726867585.83521: checking to see if all hosts have failed and the running result is not ok 30575 1726867585.83521: done checking to see if all hosts have failed 30575 1726867585.83522: getting the remaining hosts for this loop 30575 1726867585.83525: done getting the remaining hosts for this loop 30575 1726867585.83529: getting the next task for host managed_node3 30575 1726867585.83536: done getting next task for host managed_node3 30575 1726867585.83540: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867585.83544: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867585.83562: getting variables 30575 1726867585.83564: in VariableManager get_vars() 30575 1726867585.83599: Calling all_inventory to load vars for managed_node3 30575 1726867585.83602: Calling groups_inventory to load vars for managed_node3 30575 1726867585.83604: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867585.83614: Calling all_plugins_play to load vars for managed_node3 30575 1726867585.83616: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867585.83619: Calling groups_plugins_play to load vars for managed_node3 30575 1726867585.84190: done sending task result for task 0affcac9-a3a5-e081-a588-000000000732 30575 1726867585.84194: WORKER PROCESS EXITING 30575 1726867585.84558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867585.85513: done with get_vars() 30575 1726867585.85532: done getting variables 30575 1726867585.85583: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:26:25 -0400 (0:00:00.038) 0:00:21.233 ****** 30575 1726867585.85607: entering _queue_task() for managed_node3/fail 30575 1726867585.85804: worker is 1 (out of 1 available) 30575 1726867585.85818: exiting _queue_task() for managed_node3/fail 30575 1726867585.85833: done queuing things up, now waiting for results queue to drain 30575 1726867585.85834: waiting for pending results... 30575 1726867585.86005: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867585.86101: in run() - task 0affcac9-a3a5-e081-a588-000000000733 30575 1726867585.86112: variable 'ansible_search_path' from source: unknown 30575 1726867585.86115: variable 'ansible_search_path' from source: unknown 30575 1726867585.86145: calling self._execute() 30575 1726867585.86210: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867585.86215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867585.86228: variable 'omit' from source: magic vars 30575 1726867585.86596: variable 'ansible_distribution_major_version' from source: facts 30575 1726867585.86600: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867585.86710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867585.88184: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867585.88236: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867585.88264: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 
1726867585.88291: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867585.88311: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867585.88368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867585.88390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867585.88407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867585.88433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867585.88446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867585.88508: variable 'ansible_distribution_major_version' from source: facts 30575 1726867585.88518: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30575 1726867585.88593: variable 'ansible_distribution' from source: facts 30575 1726867585.88596: variable '__network_rh_distros' from source: role '' defaults 30575 1726867585.88605: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30575 1726867585.88758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867585.88778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867585.88796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867585.88820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867585.88832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867585.88864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867585.88884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867585.88901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867585.88927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867585.88935: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867585.88965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867585.88985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867585.89002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867585.89028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867585.89036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867585.89226: variable 'network_connections' from source: include params 30575 1726867585.89232: variable 'interface' from source: play vars 30575 1726867585.89279: variable 'interface' from source: play vars 30575 1726867585.89289: variable 'network_state' from source: role '' defaults 30575 1726867585.89334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867585.89447: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867585.89475: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 
1726867585.89499: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867585.89522: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867585.89552: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867585.89567: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867585.89589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867585.89606: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867585.89635: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30575 1726867585.89639: when evaluation is False, skipping this task 30575 1726867585.89641: _execute() done 30575 1726867585.89643: dumping result to json 30575 1726867585.89645: done dumping result, returning 30575 1726867585.89653: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-e081-a588-000000000733] 30575 1726867585.89656: sending task result for task 0affcac9-a3a5-e081-a588-000000000733 30575 1726867585.89742: done sending task 
result for task 0affcac9-a3a5-e081-a588-000000000733 30575 1726867585.89745: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30575 1726867585.89797: no more pending results, returning what we have 30575 1726867585.89799: results queue empty 30575 1726867585.89800: checking for any_errors_fatal 30575 1726867585.89804: done checking for any_errors_fatal 30575 1726867585.89805: checking for max_fail_percentage 30575 1726867585.89806: done checking for max_fail_percentage 30575 1726867585.89807: checking to see if all hosts have failed and the running result is not ok 30575 1726867585.89808: done checking to see if all hosts have failed 30575 1726867585.89808: getting the remaining hosts for this loop 30575 1726867585.89810: done getting the remaining hosts for this loop 30575 1726867585.89813: getting the next task for host managed_node3 30575 1726867585.89820: done getting next task for host managed_node3 30575 1726867585.89827: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867585.89831: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867585.89845: getting variables 30575 1726867585.89846: in VariableManager get_vars() 30575 1726867585.89874: Calling all_inventory to load vars for managed_node3 30575 1726867585.89876: Calling groups_inventory to load vars for managed_node3 30575 1726867585.89880: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867585.89888: Calling all_plugins_play to load vars for managed_node3 30575 1726867585.89890: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867585.89892: Calling groups_plugins_play to load vars for managed_node3 30575 1726867585.90611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867585.91486: done with get_vars() 30575 1726867585.91499: done getting variables 30575 1726867585.91539: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:26:25 -0400 (0:00:00.059) 0:00:21.293 ****** 30575 1726867585.91562: entering _queue_task() for managed_node3/dnf 30575 1726867585.91767: worker is 1 (out of 1 available) 30575 1726867585.91783: exiting _queue_task() for managed_node3/dnf 30575 1726867585.91795: done queuing things up, now waiting for results queue to drain 30575 1726867585.91796: waiting for pending results... 30575 1726867585.91961: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867585.92053: in run() - task 0affcac9-a3a5-e081-a588-000000000734 30575 1726867585.92064: variable 'ansible_search_path' from source: unknown 30575 1726867585.92067: variable 'ansible_search_path' from source: unknown 30575 1726867585.92101: calling self._execute() 30575 1726867585.92174: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867585.92181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867585.92189: variable 'omit' from source: magic vars 30575 1726867585.92461: variable 'ansible_distribution_major_version' from source: facts 30575 1726867585.92465: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867585.92782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867585.94803: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867585.94847: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867585.94884: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867585.94909: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867585.94931: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867585.94990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867585.95010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867585.95030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867585.95055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867585.95068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867585.95140: variable 'ansible_distribution' from source: facts 30575 1726867585.95144: variable 'ansible_distribution_major_version' from source: facts 30575 1726867585.95155: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30575 1726867585.95229: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867585.95313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867585.95332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867585.95349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867585.95373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867585.95387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867585.95415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867585.95434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867585.95450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867585.95474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867585.95486: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867585.95515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867585.95534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867585.95550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867585.95574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867585.95587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867585.95688: variable 'network_connections' from source: include params 30575 1726867585.95710: variable 'interface' from source: play vars 30575 1726867585.95757: variable 'interface' from source: play vars 30575 1726867585.95806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867585.95911: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867585.95940: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867585.95964: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867585.95997: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867585.96026: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867585.96044: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867585.96070: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867585.96086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867585.96131: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867585.96271: variable 'network_connections' from source: include params 30575 1726867585.96275: variable 'interface' from source: play vars 30575 1726867585.96320: variable 'interface' from source: play vars 30575 1726867585.96345: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867585.96348: when evaluation is False, skipping this task 30575 1726867585.96351: _execute() done 30575 1726867585.96353: dumping result to json 30575 1726867585.96357: done dumping result, returning 30575 1726867585.96365: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000000734] 30575 
1726867585.96370: sending task result for task 0affcac9-a3a5-e081-a588-000000000734 30575 1726867585.96447: done sending task result for task 0affcac9-a3a5-e081-a588-000000000734 30575 1726867585.96450: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867585.96541: no more pending results, returning what we have 30575 1726867585.96544: results queue empty 30575 1726867585.96545: checking for any_errors_fatal 30575 1726867585.96549: done checking for any_errors_fatal 30575 1726867585.96550: checking for max_fail_percentage 30575 1726867585.96551: done checking for max_fail_percentage 30575 1726867585.96552: checking to see if all hosts have failed and the running result is not ok 30575 1726867585.96553: done checking to see if all hosts have failed 30575 1726867585.96553: getting the remaining hosts for this loop 30575 1726867585.96555: done getting the remaining hosts for this loop 30575 1726867585.96558: getting the next task for host managed_node3 30575 1726867585.96563: done getting next task for host managed_node3 30575 1726867585.96566: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867585.96570: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867585.96586: getting variables 30575 1726867585.96587: in VariableManager get_vars() 30575 1726867585.96618: Calling all_inventory to load vars for managed_node3 30575 1726867585.96620: Calling groups_inventory to load vars for managed_node3 30575 1726867585.96622: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867585.96629: Calling all_plugins_play to load vars for managed_node3 30575 1726867585.96631: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867585.96634: Calling groups_plugins_play to load vars for managed_node3 30575 1726867585.97768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867585.98764: done with get_vars() 30575 1726867585.98779: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867585.98830: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:26:25 -0400 (0:00:00.072) 0:00:21.366 ****** 30575 1726867585.98851: entering _queue_task() for managed_node3/yum 30575 1726867585.99061: worker is 1 (out of 1 available) 30575 1726867585.99073: exiting _queue_task() for managed_node3/yum 30575 1726867585.99088: done queuing things up, now waiting for results queue to drain 30575 1726867585.99090: waiting for pending results... 30575 1726867585.99272: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867585.99372: in run() - task 0affcac9-a3a5-e081-a588-000000000735 30575 1726867585.99385: variable 'ansible_search_path' from source: unknown 30575 1726867585.99389: variable 'ansible_search_path' from source: unknown 30575 1726867585.99419: calling self._execute() 30575 1726867585.99489: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867585.99492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867585.99501: variable 'omit' from source: magic vars 30575 1726867585.99771: variable 'ansible_distribution_major_version' from source: facts 30575 1726867585.99782: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867585.99922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867586.02093: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867586.02097: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867586.02119: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867586.02155: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867586.02182: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867586.02256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867586.02285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867586.02309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.02351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867586.02364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867586.02458: variable 'ansible_distribution_major_version' from source: facts 30575 1726867586.02471: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30575 1726867586.02474: when evaluation is False, skipping this task 30575 1726867586.02479: _execute() done 30575 1726867586.02482: dumping result to json 30575 1726867586.02486: done dumping result, returning 30575 1726867586.02495: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000000735] 30575 1726867586.02499: sending task result for task 0affcac9-a3a5-e081-a588-000000000735 30575 1726867586.02590: done sending task result for task 0affcac9-a3a5-e081-a588-000000000735 30575 1726867586.02593: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30575 1726867586.02679: no more pending results, returning what we have 30575 1726867586.02682: results queue empty 30575 1726867586.02683: checking for any_errors_fatal 30575 1726867586.02688: done checking for any_errors_fatal 30575 1726867586.02689: checking for max_fail_percentage 30575 1726867586.02690: done checking for max_fail_percentage 30575 1726867586.02691: checking to see if all hosts have failed and the running result is not ok 30575 1726867586.02691: done checking to see if all hosts have failed 30575 1726867586.02692: getting the remaining hosts for this loop 30575 1726867586.02693: done getting the remaining hosts for this loop 30575 1726867586.02697: getting the next task for host managed_node3 30575 1726867586.02704: done getting next task for host managed_node3 30575 1726867586.02708: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867586.02713: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867586.02728: getting variables 30575 1726867586.02729: in VariableManager get_vars() 30575 1726867586.02758: Calling all_inventory to load vars for managed_node3 30575 1726867586.02760: Calling groups_inventory to load vars for managed_node3 30575 1726867586.02762: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867586.02770: Calling all_plugins_play to load vars for managed_node3 30575 1726867586.02772: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867586.02774: Calling groups_plugins_play to load vars for managed_node3 30575 1726867586.04000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867586.05685: done with get_vars() 30575 1726867586.05706: done getting variables 30575 1726867586.05765: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:26:26 -0400 (0:00:00.069) 0:00:21.435 ****** 30575 1726867586.05802: entering _queue_task() for managed_node3/fail 30575 1726867586.06096: worker is 1 (out of 1 available) 30575 1726867586.06109: exiting _queue_task() for managed_node3/fail 30575 1726867586.06123: done queuing things up, now waiting for results queue to drain 30575 1726867586.06124: waiting for pending results... 30575 1726867586.06418: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867586.06569: in run() - task 0affcac9-a3a5-e081-a588-000000000736 30575 1726867586.06594: variable 'ansible_search_path' from source: unknown 30575 1726867586.06607: variable 'ansible_search_path' from source: unknown 30575 1726867586.06647: calling self._execute() 30575 1726867586.06753: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867586.06766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867586.06783: variable 'omit' from source: magic vars 30575 1726867586.07174: variable 'ansible_distribution_major_version' from source: facts 30575 1726867586.07193: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867586.07365: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867586.07483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867586.09675: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867586.09757: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867586.09798: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867586.09835: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867586.09869: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867586.09950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867586.09992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867586.10022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.10069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867586.10182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867586.10186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867586.10188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867586.10190: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.10228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867586.10246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867586.10290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867586.10323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867586.10350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.10393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867586.10414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867586.10587: variable 'network_connections' from source: include params 30575 1726867586.10603: variable 'interface' from source: play vars 30575 1726867586.10673: variable 'interface' from source: play vars 30575 1726867586.10750: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867586.10896: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867586.10948: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867586.10993: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867586.11067: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867586.11084: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867586.11112: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867586.11143: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.11182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867586.11248: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867586.11582: variable 'network_connections' from source: include params 30575 1726867586.11585: variable 'interface' from source: play vars 30575 1726867586.11587: variable 'interface' from source: play vars 30575 1726867586.11615: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867586.11623: when evaluation is False, skipping this task 30575 
1726867586.11631: _execute() done 30575 1726867586.11638: dumping result to json 30575 1726867586.11646: done dumping result, returning 30575 1726867586.11659: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000000736] 30575 1726867586.11670: sending task result for task 0affcac9-a3a5-e081-a588-000000000736 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867586.11933: no more pending results, returning what we have 30575 1726867586.11936: results queue empty 30575 1726867586.11937: checking for any_errors_fatal 30575 1726867586.11945: done checking for any_errors_fatal 30575 1726867586.11946: checking for max_fail_percentage 30575 1726867586.11948: done checking for max_fail_percentage 30575 1726867586.11949: checking to see if all hosts have failed and the running result is not ok 30575 1726867586.11950: done checking to see if all hosts have failed 30575 1726867586.11951: getting the remaining hosts for this loop 30575 1726867586.11953: done getting the remaining hosts for this loop 30575 1726867586.11957: getting the next task for host managed_node3 30575 1726867586.11966: done getting next task for host managed_node3 30575 1726867586.11971: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30575 1726867586.11979: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867586.11999: getting variables 30575 1726867586.12001: in VariableManager get_vars() 30575 1726867586.12039: Calling all_inventory to load vars for managed_node3 30575 1726867586.12042: Calling groups_inventory to load vars for managed_node3 30575 1726867586.12045: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867586.12056: Calling all_plugins_play to load vars for managed_node3 30575 1726867586.12059: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867586.12062: Calling groups_plugins_play to load vars for managed_node3 30575 1726867586.12590: done sending task result for task 0affcac9-a3a5-e081-a588-000000000736 30575 1726867586.12593: WORKER PROCESS EXITING 30575 1726867586.13636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867586.15213: done with get_vars() 30575 1726867586.15233: done getting variables 30575 1726867586.15304: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:26:26 -0400 (0:00:00.095) 0:00:21.530 ****** 30575 1726867586.15342: entering _queue_task() for managed_node3/package 30575 1726867586.15634: worker is 1 (out of 1 available) 30575 1726867586.15648: exiting _queue_task() for managed_node3/package 30575 1726867586.15662: done queuing things up, now waiting for results queue to drain 30575 1726867586.15663: waiting for pending results... 30575 1726867586.15952: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30575 1726867586.16109: in run() - task 0affcac9-a3a5-e081-a588-000000000737 30575 1726867586.16129: variable 'ansible_search_path' from source: unknown 30575 1726867586.16138: variable 'ansible_search_path' from source: unknown 30575 1726867586.16180: calling self._execute() 30575 1726867586.16275: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867586.16290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867586.16306: variable 'omit' from source: magic vars 30575 1726867586.16675: variable 'ansible_distribution_major_version' from source: facts 30575 1726867586.16696: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867586.16894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867586.17184: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867586.17240: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867586.17279: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867586.17359: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867586.17492: variable 'network_packages' from source: role '' defaults 30575 1726867586.17660: variable '__network_provider_setup' from source: role '' defaults 30575 1726867586.17664: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867586.17685: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867586.17699: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867586.17761: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867586.18184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867586.20708: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867586.20776: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867586.20815: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867586.20849: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867586.20882: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867586.21324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867586.21355: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867586.21381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.21430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867586.21443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867586.21493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867586.21513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867586.21540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.21580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867586.21593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 
1726867586.21827: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867586.21921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867586.21946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867586.22045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.22048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867586.22051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867586.22117: variable 'ansible_python' from source: facts 30575 1726867586.22121: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867586.22200: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867586.22326: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867586.22405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867586.22426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867586.22454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.22504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867586.22525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867586.22581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867586.22623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867586.22658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.22705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867586.22725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867586.22880: variable 'network_connections' from source: include params 
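The skip decision the log records for the "Install packages" task hinges on Ansible's Jinja2 `subset` test: the condition `not network_packages is subset(ansible_facts.packages.keys())` is true only when at least one required package is missing from the gathered package facts, so the task is skipped when everything is already installed. A minimal Python sketch of that set logic, assuming illustrative names and sample data (not taken from the role itself):

```python
def needs_install(required, installed):
    """Mirror of Jinja2's `required is subset(installed)` test, negated:
    return True only when some required package is not yet installed."""
    return not set(required).issubset(installed)

# ansible_facts.packages maps package name -> list of version dicts;
# only the keys matter for the subset check (sample data, hypothetical).
package_facts = {"NetworkManager": [{"version": "1.46"}]}

print(needs_install(["NetworkManager"], package_facts.keys()))        # False -> task skipped
print(needs_install(["wpa_supplicant"], package_facts.keys()))        # True  -> task would run
```

This matches the `"false_condition"` shown in the skip result below: with all of `network_packages` present in `ansible_facts.packages`, the negated subset test evaluates to `False` and the package task never reaches the target host.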
30575 1726867586.22924: variable 'interface' from source: play vars 30575 1726867586.23002: variable 'interface' from source: play vars 30575 1726867586.23300: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867586.23337: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867586.23372: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.23422: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867586.23525: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867586.23772: variable 'network_connections' from source: include params 30575 1726867586.23776: variable 'interface' from source: play vars 30575 1726867586.23855: variable 'interface' from source: play vars 30575 1726867586.23894: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867586.23949: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867586.24140: variable 'network_connections' from source: include params 30575 1726867586.24143: variable 'interface' from source: play vars 30575 1726867586.24190: variable 'interface' from source: play vars 30575 1726867586.24209: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867586.24264: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867586.24455: variable 'network_connections' 
from source: include params 30575 1726867586.24459: variable 'interface' from source: play vars 30575 1726867586.24507: variable 'interface' from source: play vars 30575 1726867586.24549: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867586.24593: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867586.24598: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867586.24642: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867586.24775: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867586.25080: variable 'network_connections' from source: include params 30575 1726867586.25084: variable 'interface' from source: play vars 30575 1726867586.25151: variable 'interface' from source: play vars 30575 1726867586.25382: variable 'ansible_distribution' from source: facts 30575 1726867586.25385: variable '__network_rh_distros' from source: role '' defaults 30575 1726867586.25388: variable 'ansible_distribution_major_version' from source: facts 30575 1726867586.25390: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867586.25392: variable 'ansible_distribution' from source: facts 30575 1726867586.25394: variable '__network_rh_distros' from source: role '' defaults 30575 1726867586.25396: variable 'ansible_distribution_major_version' from source: facts 30575 1726867586.25398: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867586.25587: variable 'ansible_distribution' from source: facts 30575 1726867586.25607: variable '__network_rh_distros' from source: role '' defaults 30575 1726867586.25617: variable 'ansible_distribution_major_version' from source: facts 30575 1726867586.25657: variable 'network_provider' from source: set_fact 30575 
1726867586.25680: variable 'ansible_facts' from source: unknown 30575 1726867586.26366: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30575 1726867586.26369: when evaluation is False, skipping this task 30575 1726867586.26372: _execute() done 30575 1726867586.26374: dumping result to json 30575 1726867586.26381: done dumping result, returning 30575 1726867586.26437: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-e081-a588-000000000737] 30575 1726867586.26449: sending task result for task 0affcac9-a3a5-e081-a588-000000000737 skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30575 1726867586.26682: no more pending results, returning what we have 30575 1726867586.26686: results queue empty 30575 1726867586.26687: checking for any_errors_fatal 30575 1726867586.26695: done checking for any_errors_fatal 30575 1726867586.26695: checking for max_fail_percentage 30575 1726867586.26698: done checking for max_fail_percentage 30575 1726867586.26698: checking to see if all hosts have failed and the running result is not ok 30575 1726867586.26700: done checking to see if all hosts have failed 30575 1726867586.26700: getting the remaining hosts for this loop 30575 1726867586.26702: done getting the remaining hosts for this loop 30575 1726867586.26707: getting the next task for host managed_node3 30575 1726867586.26716: done getting next task for host managed_node3 30575 1726867586.26721: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867586.26726: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867586.26854: done sending task result for task 0affcac9-a3a5-e081-a588-000000000737 30575 1726867586.26858: WORKER PROCESS EXITING 30575 1726867586.26872: getting variables 30575 1726867586.26874: in VariableManager get_vars() 30575 1726867586.26921: Calling all_inventory to load vars for managed_node3 30575 1726867586.26924: Calling groups_inventory to load vars for managed_node3 30575 1726867586.26927: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867586.26939: Calling all_plugins_play to load vars for managed_node3 30575 1726867586.26942: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867586.26945: Calling groups_plugins_play to load vars for managed_node3 30575 1726867586.29022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867586.30935: done with get_vars() 30575 1726867586.30959: done getting variables 30575 1726867586.31016: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:26:26 -0400 (0:00:00.157) 0:00:21.688 ****** 30575 1726867586.31050: entering _queue_task() for managed_node3/package 30575 1726867586.31479: worker is 1 (out of 1 available) 30575 1726867586.31493: exiting _queue_task() for managed_node3/package 30575 1726867586.31506: done queuing things up, now waiting for results queue to drain 30575 1726867586.31508: waiting for pending results... 30575 1726867586.31852: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867586.32016: in run() - task 0affcac9-a3a5-e081-a588-000000000738 30575 1726867586.32020: variable 'ansible_search_path' from source: unknown 30575 1726867586.32023: variable 'ansible_search_path' from source: unknown 30575 1726867586.32050: calling self._execute() 30575 1726867586.32345: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867586.32354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867586.32358: variable 'omit' from source: magic vars 30575 1726867586.33343: variable 'ansible_distribution_major_version' from source: facts 30575 1726867586.33347: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867586.33350: variable 'network_state' from source: role '' defaults 30575 1726867586.33353: Evaluated conditional (network_state != {}): False 30575 1726867586.33355: when evaluation 
is False, skipping this task 30575 1726867586.33358: _execute() done 30575 1726867586.33361: dumping result to json 30575 1726867586.33363: done dumping result, returning 30575 1726867586.33366: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000000738] 30575 1726867586.33369: sending task result for task 0affcac9-a3a5-e081-a588-000000000738 30575 1726867586.33482: done sending task result for task 0affcac9-a3a5-e081-a588-000000000738 30575 1726867586.33487: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867586.33537: no more pending results, returning what we have 30575 1726867586.33542: results queue empty 30575 1726867586.33545: checking for any_errors_fatal 30575 1726867586.33551: done checking for any_errors_fatal 30575 1726867586.33551: checking for max_fail_percentage 30575 1726867586.33553: done checking for max_fail_percentage 30575 1726867586.33554: checking to see if all hosts have failed and the running result is not ok 30575 1726867586.33555: done checking to see if all hosts have failed 30575 1726867586.33556: getting the remaining hosts for this loop 30575 1726867586.33557: done getting the remaining hosts for this loop 30575 1726867586.33561: getting the next task for host managed_node3 30575 1726867586.33572: done getting next task for host managed_node3 30575 1726867586.33578: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867586.33584: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867586.33605: getting variables 30575 1726867586.33607: in VariableManager get_vars() 30575 1726867586.33644: Calling all_inventory to load vars for managed_node3 30575 1726867586.33647: Calling groups_inventory to load vars for managed_node3 30575 1726867586.33649: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867586.33661: Calling all_plugins_play to load vars for managed_node3 30575 1726867586.33665: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867586.33670: Calling groups_plugins_play to load vars for managed_node3 30575 1726867586.35728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867586.37497: done with get_vars() 30575 1726867586.37517: done getting variables 30575 1726867586.37576: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:26:26 -0400 (0:00:00.065) 0:00:21.753 ****** 30575 1726867586.37612: entering _queue_task() for managed_node3/package 30575 1726867586.38205: worker is 1 (out of 1 available) 30575 1726867586.38216: exiting _queue_task() for managed_node3/package 30575 1726867586.38230: done queuing things up, now waiting for results queue to drain 30575 1726867586.38232: waiting for pending results... 30575 1726867586.38615: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867586.38778: in run() - task 0affcac9-a3a5-e081-a588-000000000739 30575 1726867586.38791: variable 'ansible_search_path' from source: unknown 30575 1726867586.38796: variable 'ansible_search_path' from source: unknown 30575 1726867586.38879: calling self._execute() 30575 1726867586.38943: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867586.38947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867586.38959: variable 'omit' from source: magic vars 30575 1726867586.39344: variable 'ansible_distribution_major_version' from source: facts 30575 1726867586.39420: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867586.39488: variable 'network_state' from source: role '' defaults 30575 1726867586.39501: Evaluated conditional (network_state != {}): False 30575 1726867586.39504: when evaluation is False, skipping this task 30575 1726867586.39507: _execute() done 30575 1726867586.39510: dumping 
result to json 30575 1726867586.39512: done dumping result, returning 30575 1726867586.39526: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000000739] 30575 1726867586.39529: sending task result for task 0affcac9-a3a5-e081-a588-000000000739 30575 1726867586.39729: done sending task result for task 0affcac9-a3a5-e081-a588-000000000739 30575 1726867586.39733: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867586.39774: no more pending results, returning what we have 30575 1726867586.39780: results queue empty 30575 1726867586.39781: checking for any_errors_fatal 30575 1726867586.39787: done checking for any_errors_fatal 30575 1726867586.39788: checking for max_fail_percentage 30575 1726867586.39789: done checking for max_fail_percentage 30575 1726867586.39790: checking to see if all hosts have failed and the running result is not ok 30575 1726867586.39791: done checking to see if all hosts have failed 30575 1726867586.39792: getting the remaining hosts for this loop 30575 1726867586.39793: done getting the remaining hosts for this loop 30575 1726867586.39796: getting the next task for host managed_node3 30575 1726867586.39804: done getting next task for host managed_node3 30575 1726867586.39807: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867586.39812: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867586.39831: getting variables 30575 1726867586.39833: in VariableManager get_vars() 30575 1726867586.39864: Calling all_inventory to load vars for managed_node3 30575 1726867586.39867: Calling groups_inventory to load vars for managed_node3 30575 1726867586.39869: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867586.39880: Calling all_plugins_play to load vars for managed_node3 30575 1726867586.39883: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867586.39887: Calling groups_plugins_play to load vars for managed_node3 30575 1726867586.41408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867586.42943: done with get_vars() 30575 1726867586.42963: done getting variables 30575 1726867586.43028: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or 
team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:26:26 -0400 (0:00:00.054) 0:00:21.808 ****** 30575 1726867586.43062: entering _queue_task() for managed_node3/service 30575 1726867586.43344: worker is 1 (out of 1 available) 30575 1726867586.43355: exiting _queue_task() for managed_node3/service 30575 1726867586.43367: done queuing things up, now waiting for results queue to drain 30575 1726867586.43369: waiting for pending results... 30575 1726867586.43705: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867586.43793: in run() - task 0affcac9-a3a5-e081-a588-00000000073a 30575 1726867586.43984: variable 'ansible_search_path' from source: unknown 30575 1726867586.43987: variable 'ansible_search_path' from source: unknown 30575 1726867586.43991: calling self._execute() 30575 1726867586.43993: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867586.43996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867586.43998: variable 'omit' from source: magic vars 30575 1726867586.44380: variable 'ansible_distribution_major_version' from source: facts 30575 1726867586.44398: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867586.44534: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867586.44738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867586.47061: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867586.47145: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867586.47193: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867586.47237: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867586.47271: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867586.47358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867586.47399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867586.47433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.47478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867586.47504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867586.47557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867586.47588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867586.47621: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.47669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867586.47692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867586.47744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867586.47773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867586.47817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.47855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867586.47928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867586.48081: variable 'network_connections' from source: include params 30575 1726867586.48099: variable 'interface' from source: play vars 30575 1726867586.48180: variable 'interface' from source: play vars 30575 1726867586.48266: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867586.48437: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867586.48497: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867586.48536: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867586.48782: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867586.48785: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867586.48787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867586.48789: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.48791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867586.48793: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867586.49009: variable 'network_connections' from source: include params 30575 1726867586.49021: variable 'interface' from source: play vars 30575 1726867586.49087: variable 'interface' from source: play vars 30575 1726867586.49129: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867586.49137: when evaluation is False, skipping this task 30575 
1726867586.49143: _execute() done 30575 1726867586.49149: dumping result to json 30575 1726867586.49155: done dumping result, returning 30575 1726867586.49165: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-00000000073a] 30575 1726867586.49174: sending task result for task 0affcac9-a3a5-e081-a588-00000000073a skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867586.49485: no more pending results, returning what we have 30575 1726867586.49488: results queue empty 30575 1726867586.49489: checking for any_errors_fatal 30575 1726867586.49495: done checking for any_errors_fatal 30575 1726867586.49496: checking for max_fail_percentage 30575 1726867586.49498: done checking for max_fail_percentage 30575 1726867586.49499: checking to see if all hosts have failed and the running result is not ok 30575 1726867586.49500: done checking to see if all hosts have failed 30575 1726867586.49501: getting the remaining hosts for this loop 30575 1726867586.49503: done getting the remaining hosts for this loop 30575 1726867586.49506: getting the next task for host managed_node3 30575 1726867586.49514: done getting next task for host managed_node3 30575 1726867586.49518: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867586.49525: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867586.49548: getting variables 30575 1726867586.49550: in VariableManager get_vars() 30575 1726867586.49587: Calling all_inventory to load vars for managed_node3 30575 1726867586.49589: Calling groups_inventory to load vars for managed_node3 30575 1726867586.49592: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867586.49603: Calling all_plugins_play to load vars for managed_node3 30575 1726867586.49606: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867586.49610: Calling groups_plugins_play to load vars for managed_node3 30575 1726867586.50223: done sending task result for task 0affcac9-a3a5-e081-a588-00000000073a 30575 1726867586.50227: WORKER PROCESS EXITING 30575 1726867586.51171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867586.52836: done with get_vars() 30575 1726867586.52863: done getting variables 30575 1726867586.52922: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:26:26 -0400 (0:00:00.098) 0:00:21.907 ****** 30575 1726867586.52954: entering _queue_task() for managed_node3/service 30575 1726867586.53283: worker is 1 (out of 1 available) 30575 1726867586.53398: exiting _queue_task() for managed_node3/service 30575 1726867586.53409: done queuing things up, now waiting for results queue to drain 30575 1726867586.53411: waiting for pending results... 30575 1726867586.53680: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867586.53760: in run() - task 0affcac9-a3a5-e081-a588-00000000073b 30575 1726867586.53783: variable 'ansible_search_path' from source: unknown 30575 1726867586.53794: variable 'ansible_search_path' from source: unknown 30575 1726867586.53840: calling self._execute() 30575 1726867586.53947: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867586.54053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867586.54057: variable 'omit' from source: magic vars 30575 1726867586.54363: variable 'ansible_distribution_major_version' from source: facts 30575 1726867586.54387: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867586.54610: variable 'network_provider' from source: set_fact 30575 1726867586.54683: variable 'network_state' from source: role '' defaults 30575 1726867586.54687: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30575 1726867586.54689: variable 'omit' from source: magic vars 30575 1726867586.54709: variable 
'omit' from source: magic vars 30575 1726867586.54744: variable 'network_service_name' from source: role '' defaults 30575 1726867586.54818: variable 'network_service_name' from source: role '' defaults 30575 1726867586.54938: variable '__network_provider_setup' from source: role '' defaults 30575 1726867586.54951: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867586.55016: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867586.55039: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867586.55106: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867586.55347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867586.59274: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867586.59281: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867586.59311: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867586.59352: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867586.59419: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867586.59559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867586.59624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867586.59664: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.59715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867586.59737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867586.60039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867586.60119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867586.60355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.60359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867586.60362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867586.61383: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867586.61594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867586.61624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867586.61654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.61700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867586.61799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867586.61997: variable 'ansible_python' from source: facts 30575 1726867586.62019: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867586.62204: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867586.62584: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867586.62616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867586.62647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867586.62679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.62822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867586.62844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867586.62898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867586.63009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867586.63038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.63083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867586.63482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867586.63486: variable 'network_connections' from source: include params 30575 1726867586.63488: variable 'interface' from source: play vars 30575 1726867586.63609: variable 'interface' from source: play vars 30575 1726867586.63719: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867586.64176: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867586.64331: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867586.64380: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867586.64423: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867586.64641: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867586.64675: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867586.64713: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867586.64817: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867586.64869: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867586.65483: variable 'network_connections' from source: include params 30575 1726867586.65497: variable 'interface' from source: play vars 30575 1726867586.65571: variable 'interface' from source: play vars 30575 1726867586.65729: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867586.65911: variable '__network_wireless_connections_defined' from source: role '' defaults 
30575 1726867586.66683: variable 'network_connections' from source: include params 30575 1726867586.66686: variable 'interface' from source: play vars 30575 1726867586.66688: variable 'interface' from source: play vars 30575 1726867586.66711: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867586.66789: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867586.67474: variable 'network_connections' from source: include params 30575 1726867586.67489: variable 'interface' from source: play vars 30575 1726867586.67560: variable 'interface' from source: play vars 30575 1726867586.67743: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867586.67807: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867586.67891: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867586.67953: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867586.68450: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867586.68942: variable 'network_connections' from source: include params 30575 1726867586.68953: variable 'interface' from source: play vars 30575 1726867586.69020: variable 'interface' from source: play vars 30575 1726867586.69035: variable 'ansible_distribution' from source: facts 30575 1726867586.69044: variable '__network_rh_distros' from source: role '' defaults 30575 1726867586.69055: variable 'ansible_distribution_major_version' from source: facts 30575 1726867586.69094: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867586.69274: variable 'ansible_distribution' from source: facts 30575 1726867586.69288: variable '__network_rh_distros' from source: role '' defaults 30575 1726867586.69299: variable 'ansible_distribution_major_version' from 
source: facts 30575 1726867586.69313: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867586.69495: variable 'ansible_distribution' from source: facts 30575 1726867586.69504: variable '__network_rh_distros' from source: role '' defaults 30575 1726867586.69514: variable 'ansible_distribution_major_version' from source: facts 30575 1726867586.69564: variable 'network_provider' from source: set_fact 30575 1726867586.69596: variable 'omit' from source: magic vars 30575 1726867586.69627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867586.69663: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867586.69690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867586.69713: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867586.69729: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867586.69772: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867586.69783: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867586.69792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867586.69976: Set connection var ansible_pipelining to False 30575 1726867586.69981: Set connection var ansible_shell_type to sh 30575 1726867586.69983: Set connection var ansible_shell_executable to /bin/sh 30575 1726867586.69991: Set connection var ansible_timeout to 10 30575 1726867586.69993: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867586.69995: Set connection var ansible_connection to ssh 30575 1726867586.69997: variable 'ansible_shell_executable' from 
source: unknown 30575 1726867586.69999: variable 'ansible_connection' from source: unknown 30575 1726867586.70001: variable 'ansible_module_compression' from source: unknown 30575 1726867586.70003: variable 'ansible_shell_type' from source: unknown 30575 1726867586.70005: variable 'ansible_shell_executable' from source: unknown 30575 1726867586.70007: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867586.70015: variable 'ansible_pipelining' from source: unknown 30575 1726867586.70023: variable 'ansible_timeout' from source: unknown 30575 1726867586.70031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867586.70147: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867586.70166: variable 'omit' from source: magic vars 30575 1726867586.70179: starting attempt loop 30575 1726867586.70191: running the handler 30575 1726867586.70301: variable 'ansible_facts' from source: unknown 30575 1726867586.71030: _low_level_execute_command(): starting 30575 1726867586.71042: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867586.71853: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867586.71888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867586.71980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867586.73671: stdout chunk (state=3): >>>/root <<< 30575 1726867586.73921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867586.74044: stderr chunk (state=3): >>><<< 30575 1726867586.74047: stdout chunk (state=3): >>><<< 30575 1726867586.74050: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867586.74052: _low_level_execute_command(): starting 30575 1726867586.74055: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867586.7397037-31581-82306465866643 `" && echo ansible-tmp-1726867586.7397037-31581-82306465866643="` echo /root/.ansible/tmp/ansible-tmp-1726867586.7397037-31581-82306465866643 `" ) && sleep 0' 30575 1726867586.74793: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867586.74834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867586.74851: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867586.74872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867586.74950: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30575 1726867586.76915: stdout chunk (state=3): >>>ansible-tmp-1726867586.7397037-31581-82306465866643=/root/.ansible/tmp/ansible-tmp-1726867586.7397037-31581-82306465866643 <<< 30575 1726867586.76997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867586.77090: stderr chunk (state=3): >>><<< 30575 1726867586.77094: stdout chunk (state=3): >>><<< 30575 1726867586.77212: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867586.7397037-31581-82306465866643=/root/.ansible/tmp/ansible-tmp-1726867586.7397037-31581-82306465866643 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867586.77216: variable 'ansible_module_compression' from source: unknown 30575 1726867586.77263: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30575 1726867586.77350: variable 'ansible_facts' from source: unknown 30575 1726867586.77593: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867586.7397037-31581-82306465866643/AnsiballZ_systemd.py 30575 1726867586.77807: Sending initial data 30575 1726867586.77810: Sent initial data (155 bytes) 30575 1726867586.78356: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867586.78462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867586.78494: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867586.78508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867586.78589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867586.80324: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867586.80375: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867586.80456: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpasvblavq /root/.ansible/tmp/ansible-tmp-1726867586.7397037-31581-82306465866643/AnsiballZ_systemd.py <<< 30575 1726867586.80460: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867586.7397037-31581-82306465866643/AnsiballZ_systemd.py" <<< 30575 1726867586.80503: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpasvblavq" to remote "/root/.ansible/tmp/ansible-tmp-1726867586.7397037-31581-82306465866643/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867586.7397037-31581-82306465866643/AnsiballZ_systemd.py" <<< 30575 1726867586.82115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867586.82260: stderr chunk (state=3): >>><<< 30575 1726867586.82263: stdout chunk (state=3): >>><<< 30575 1726867586.82265: done transferring module to remote 30575 1726867586.82271: _low_level_execute_command(): starting 30575 1726867586.82273: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726867586.7397037-31581-82306465866643/ /root/.ansible/tmp/ansible-tmp-1726867586.7397037-31581-82306465866643/AnsiballZ_systemd.py && sleep 0' 30575 1726867586.83251: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867586.83255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867586.83257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867586.83259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867586.83276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867586.83296: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867586.83407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867586.83537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867586.83590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867586.83593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867586.83650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867586.85625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 30575 1726867586.85628: stdout chunk (state=3): >>><<< 30575 1726867586.85630: stderr chunk (state=3): >>><<< 30575 1726867586.85632: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867586.85634: _low_level_execute_command(): starting 30575 1726867586.85637: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867586.7397037-31581-82306465866643/AnsiballZ_systemd.py && sleep 0' 30575 1726867586.86206: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867586.86231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867586.86310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867587.16025: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", 
"ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10506240", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314499584", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1780844000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": 
"[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30575 1726867587.16057: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": 
"infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", 
"InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30575 1726867587.17999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867587.18083: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867587.18086: stdout chunk (state=3): >>><<< 30575 1726867587.18089: stderr chunk (state=3): >>><<< 30575 1726867587.18093: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10506240", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314499584", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1780844000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867587.18265: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867586.7397037-31581-82306465866643/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867587.18300: _low_level_execute_command(): starting 30575 1726867587.18309: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867586.7397037-31581-82306465866643/ > /dev/null 2>&1 && sleep 0' 30575 1726867587.18991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867587.19009: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867587.19073: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867587.19127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867587.19153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867587.19188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867587.19259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867587.21482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867587.21485: stdout chunk (state=3): >>><<< 30575 1726867587.21487: stderr chunk (state=3): >>><<< 30575 1726867587.21489: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 30575 1726867587.21491: handler run complete 30575 1726867587.21493: attempt loop complete, returning result 30575 1726867587.21495: _execute() done 30575 1726867587.21496: dumping result to json 30575 1726867587.21498: done dumping result, returning 30575 1726867587.21500: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-e081-a588-00000000073b] 30575 1726867587.21502: sending task result for task 0affcac9-a3a5-e081-a588-00000000073b 30575 1726867587.22063: done sending task result for task 0affcac9-a3a5-e081-a588-00000000073b 30575 1726867587.22067: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867587.22130: no more pending results, returning what we have 30575 1726867587.22133: results queue empty 30575 1726867587.22134: checking for any_errors_fatal 30575 1726867587.22137: done checking for any_errors_fatal 30575 1726867587.22138: checking for max_fail_percentage 30575 1726867587.22139: done checking for max_fail_percentage 30575 1726867587.22140: checking to see if all hosts have failed and the running result is not ok 30575 1726867587.22141: done checking to see if all hosts have failed 30575 1726867587.22142: getting the remaining hosts for this loop 30575 1726867587.22143: done getting the remaining hosts for this loop 30575 1726867587.22146: getting the next task for host managed_node3 30575 1726867587.22152: done getting next task for host managed_node3 30575 1726867587.22156: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867587.22161: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867587.22171: getting variables 30575 1726867587.22172: in VariableManager get_vars() 30575 1726867587.22199: Calling all_inventory to load vars for managed_node3 30575 1726867587.22201: Calling groups_inventory to load vars for managed_node3 30575 1726867587.22203: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867587.22212: Calling all_plugins_play to load vars for managed_node3 30575 1726867587.22214: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867587.22216: Calling groups_plugins_play to load vars for managed_node3 30575 1726867587.22925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867587.24671: done with get_vars() 30575 1726867587.24790: done getting variables 30575 1726867587.24899: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:26:27 -0400 (0:00:00.719) 0:00:22.626 ****** 30575 1726867587.24939: entering _queue_task() for managed_node3/service 30575 1726867587.25247: worker is 1 (out of 1 available) 30575 1726867587.25260: exiting _queue_task() for managed_node3/service 30575 1726867587.25272: done queuing things up, now waiting for results queue to drain 30575 1726867587.25274: waiting for pending results... 30575 1726867587.25585: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867587.25680: in run() - task 0affcac9-a3a5-e081-a588-00000000073c 30575 1726867587.25692: variable 'ansible_search_path' from source: unknown 30575 1726867587.25695: variable 'ansible_search_path' from source: unknown 30575 1726867587.25732: calling self._execute() 30575 1726867587.25823: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867587.25831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867587.25841: variable 'omit' from source: magic vars 30575 1726867587.26206: variable 'ansible_distribution_major_version' from source: facts 30575 1726867587.26220: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867587.26330: variable 'network_provider' from source: set_fact 30575 1726867587.26334: Evaluated conditional (network_provider == "nm"): True 30575 1726867587.26424: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867587.26516: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 30575 1726867587.26761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867587.28703: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867587.28740: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867587.28774: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867587.28811: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867587.28835: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867587.29052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867587.29187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867587.29191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867587.29193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867587.29196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867587.29198: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867587.29219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867587.29244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867587.29282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867587.29294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867587.29336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867587.29361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867587.29394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867587.29433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 
1726867587.29448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867587.29607: variable 'network_connections' from source: include params 30575 1726867587.29610: variable 'interface' from source: play vars 30575 1726867587.29660: variable 'interface' from source: play vars 30575 1726867587.29859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867587.30484: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867587.30570: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867587.30573: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867587.30580: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867587.30624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867587.30711: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867587.30714: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867587.30716: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867587.30744: variable 
'__network_wireless_connections_defined' from source: role '' defaults 30575 1726867587.31157: variable 'network_connections' from source: include params 30575 1726867587.31160: variable 'interface' from source: play vars 30575 1726867587.31263: variable 'interface' from source: play vars 30575 1726867587.31297: Evaluated conditional (__network_wpa_supplicant_required): False 30575 1726867587.31301: when evaluation is False, skipping this task 30575 1726867587.31303: _execute() done 30575 1726867587.31306: dumping result to json 30575 1726867587.31308: done dumping result, returning 30575 1726867587.31318: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-e081-a588-00000000073c] 30575 1726867587.31328: sending task result for task 0affcac9-a3a5-e081-a588-00000000073c skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30575 1726867587.31548: no more pending results, returning what we have 30575 1726867587.31551: results queue empty 30575 1726867587.31552: checking for any_errors_fatal 30575 1726867587.31570: done checking for any_errors_fatal 30575 1726867587.31571: checking for max_fail_percentage 30575 1726867587.31572: done checking for max_fail_percentage 30575 1726867587.31573: checking to see if all hosts have failed and the running result is not ok 30575 1726867587.31574: done checking to see if all hosts have failed 30575 1726867587.31574: getting the remaining hosts for this loop 30575 1726867587.31575: done getting the remaining hosts for this loop 30575 1726867587.31580: getting the next task for host managed_node3 30575 1726867587.31587: done getting next task for host managed_node3 30575 1726867587.31590: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867587.31594: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867587.31609: getting variables 30575 1726867587.31610: in VariableManager get_vars() 30575 1726867587.31642: Calling all_inventory to load vars for managed_node3 30575 1726867587.31644: Calling groups_inventory to load vars for managed_node3 30575 1726867587.31646: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867587.31654: Calling all_plugins_play to load vars for managed_node3 30575 1726867587.31656: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867587.31658: Calling groups_plugins_play to load vars for managed_node3 30575 1726867587.32190: done sending task result for task 0affcac9-a3a5-e081-a588-00000000073c 30575 1726867587.32194: WORKER PROCESS EXITING 30575 1726867587.33394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867587.36395: done with get_vars() 30575 1726867587.36418: done getting variables 30575 1726867587.36690: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:26:27 -0400 (0:00:00.117) 0:00:22.744 ****** 30575 1726867587.36723: entering _queue_task() for managed_node3/service 30575 1726867587.37272: worker is 1 (out of 1 available) 30575 1726867587.37311: exiting _queue_task() for managed_node3/service 30575 1726867587.37345: done queuing things up, now waiting for results queue to drain 30575 1726867587.37347: waiting for pending results... 
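The records above show the pattern driving most of this run: each task's `when:` conditional is templated against the host's variables, and a `False` result skips the task (e.g. `Evaluated conditional (network_provider == "nm"): True` followed by `Evaluated conditional (__network_wpa_supplicant_required): False`). A minimal sketch of that evaluation, assuming a plain Python expression stand-in — ansible-core actually routes the expression through Jinja2 templating, so this is illustrative only:

```python
# Sketch of evaluating a `when:` conditional against task variables.
# Ansible itself templates the expression with Jinja2; this stand-in uses
# a restricted eval purely to illustrate the True/False decisions logged above.
def evaluate_when(condition: str, variables: dict) -> bool:
    # Expose only the task variables, no builtins, mimicking a sandboxed
    # template evaluation. Hypothetical helper, not an Ansible API.
    return bool(eval(condition, {"__builtins__": {}}, dict(variables)))

task_vars = {
    "network_provider": "nm",
    "ansible_distribution_major_version": "40",
}

# Mirrors the evaluations seen in the log:
print(evaluate_when('ansible_distribution_major_version != "6"', task_vars))  # True
print(evaluate_when('network_provider == "initscripts"', task_vars))          # False -> task skipped
```

A `False` here is why the subsequent records read "when evaluation is False, skipping this task" rather than dispatching the module to the worker.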
30575 1726867587.37797: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867587.37802: in run() - task 0affcac9-a3a5-e081-a588-00000000073d 30575 1726867587.37806: variable 'ansible_search_path' from source: unknown 30575 1726867587.37809: variable 'ansible_search_path' from source: unknown 30575 1726867587.37812: calling self._execute() 30575 1726867587.37907: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867587.37911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867587.37924: variable 'omit' from source: magic vars 30575 1726867587.38300: variable 'ansible_distribution_major_version' from source: facts 30575 1726867587.38312: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867587.38476: variable 'network_provider' from source: set_fact 30575 1726867587.38482: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867587.38484: when evaluation is False, skipping this task 30575 1726867587.38487: _execute() done 30575 1726867587.38489: dumping result to json 30575 1726867587.38491: done dumping result, returning 30575 1726867587.38494: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-e081-a588-00000000073d] 30575 1726867587.38496: sending task result for task 0affcac9-a3a5-e081-a588-00000000073d 30575 1726867587.38556: done sending task result for task 0affcac9-a3a5-e081-a588-00000000073d 30575 1726867587.38559: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867587.38620: no more pending results, returning what we have 30575 1726867587.38624: results queue empty 30575 1726867587.38624: checking for any_errors_fatal 30575 1726867587.38632: done checking for 
any_errors_fatal 30575 1726867587.38632: checking for max_fail_percentage 30575 1726867587.38634: done checking for max_fail_percentage 30575 1726867587.38635: checking to see if all hosts have failed and the running result is not ok 30575 1726867587.38636: done checking to see if all hosts have failed 30575 1726867587.38637: getting the remaining hosts for this loop 30575 1726867587.38639: done getting the remaining hosts for this loop 30575 1726867587.38642: getting the next task for host managed_node3 30575 1726867587.38651: done getting next task for host managed_node3 30575 1726867587.38654: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867587.38659: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867587.38676: getting variables 30575 1726867587.38680: in VariableManager get_vars() 30575 1726867587.38710: Calling all_inventory to load vars for managed_node3 30575 1726867587.38712: Calling groups_inventory to load vars for managed_node3 30575 1726867587.38714: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867587.38722: Calling all_plugins_play to load vars for managed_node3 30575 1726867587.38725: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867587.38727: Calling groups_plugins_play to load vars for managed_node3 30575 1726867587.40963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867587.42628: done with get_vars() 30575 1726867587.42649: done getting variables 30575 1726867587.42711: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:26:27 -0400 (0:00:00.060) 0:00:22.805 ****** 30575 1726867587.42748: entering _queue_task() for managed_node3/copy 30575 1726867587.43117: worker is 1 (out of 1 available) 30575 1726867587.43251: exiting _queue_task() for managed_node3/copy 30575 1726867587.43263: done queuing things up, now waiting for results queue to drain 30575 1726867587.43265: waiting for pending results... 
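When a task is skipped, the executor still emits a structured result, which the callback renders as the `skipping: [managed_node3] => {...}` blocks above. A sketch of that payload, with field names taken from the log output itself (the assembly code is illustrative, not ansible-core's):

```python
import json

# Reconstructs the skipped-task result dict shown in the log for
# "Ensure initscripts network file dependency is present".
def skipped_result(false_condition: str) -> dict:
    return {
        "changed": False,
        "false_condition": false_condition,
        "skip_reason": "Conditional result was False",
    }

print(json.dumps(skipped_result('network_provider == "initscripts"'), indent=4))
```

Note that tasks with `no_log: true` (like "Enable network service" above) have this payload replaced with a `"censored"` message before it reaches the callback.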
30575 1726867587.43719: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867587.43728: in run() - task 0affcac9-a3a5-e081-a588-00000000073e 30575 1726867587.43732: variable 'ansible_search_path' from source: unknown 30575 1726867587.43735: variable 'ansible_search_path' from source: unknown 30575 1726867587.43781: calling self._execute() 30575 1726867587.43870: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867587.43875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867587.43889: variable 'omit' from source: magic vars 30575 1726867587.44335: variable 'ansible_distribution_major_version' from source: facts 30575 1726867587.44343: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867587.44526: variable 'network_provider' from source: set_fact 30575 1726867587.44530: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867587.44534: when evaluation is False, skipping this task 30575 1726867587.44537: _execute() done 30575 1726867587.44540: dumping result to json 30575 1726867587.44542: done dumping result, returning 30575 1726867587.44548: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-e081-a588-00000000073e] 30575 1726867587.44551: sending task result for task 0affcac9-a3a5-e081-a588-00000000073e 30575 1726867587.44647: done sending task result for task 0affcac9-a3a5-e081-a588-00000000073e 30575 1726867587.44650: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30575 1726867587.44726: no more pending results, returning what we have 30575 1726867587.44730: results queue empty 30575 1726867587.44731: checking for 
any_errors_fatal 30575 1726867587.44737: done checking for any_errors_fatal 30575 1726867587.44738: checking for max_fail_percentage 30575 1726867587.44740: done checking for max_fail_percentage 30575 1726867587.44741: checking to see if all hosts have failed and the running result is not ok 30575 1726867587.44746: done checking to see if all hosts have failed 30575 1726867587.44747: getting the remaining hosts for this loop 30575 1726867587.44749: done getting the remaining hosts for this loop 30575 1726867587.44755: getting the next task for host managed_node3 30575 1726867587.44767: done getting next task for host managed_node3 30575 1726867587.44781: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867587.44786: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867587.44810: getting variables 30575 1726867587.44813: in VariableManager get_vars() 30575 1726867587.44859: Calling all_inventory to load vars for managed_node3 30575 1726867587.44861: Calling groups_inventory to load vars for managed_node3 30575 1726867587.44864: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867587.44989: Calling all_plugins_play to load vars for managed_node3 30575 1726867587.44993: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867587.44997: Calling groups_plugins_play to load vars for managed_node3 30575 1726867587.46846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867587.48621: done with get_vars() 30575 1726867587.48647: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:26:27 -0400 (0:00:00.059) 0:00:22.864 ****** 30575 1726867587.48741: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867587.49099: worker is 1 (out of 1 available) 30575 1726867587.49112: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867587.49128: done queuing things up, now waiting for results queue to drain 30575 1726867587.49130: waiting for pending results... 
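The records that follow show the `template` lookup probing an ordered list of candidate paths for `get_ansible_managed.j2` (role `templates/` directory first, then the role root, then the task and playbook directories) and using the first match. A first-hit path search of that shape can be sketched as follows — the candidate paths below are placeholders, not the real collection layout:

```python
import os
import tempfile

# Sketch of a first-match file lookup: walk the candidates in order and
# return the first path that exists, as the template lookup's search_path
# records in the log suggest.
def find_first_existing(candidates):
    for path in candidates:
        if os.path.exists(path):
            return path
    return None

# Usage: only the second (real) path exists, so it wins.
with tempfile.NamedTemporaryFile(suffix=".j2") as real:
    hit = find_first_existing(["/nonexistent/templates/get_ansible_managed.j2", real.name])
    print(hit == real.name)  # True
```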
30575 1726867587.49542: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867587.49685: in run() - task 0affcac9-a3a5-e081-a588-00000000073f 30575 1726867587.49689: variable 'ansible_search_path' from source: unknown 30575 1726867587.49692: variable 'ansible_search_path' from source: unknown 30575 1726867587.49695: calling self._execute() 30575 1726867587.49733: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867587.49744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867587.49759: variable 'omit' from source: magic vars 30575 1726867587.50151: variable 'ansible_distribution_major_version' from source: facts 30575 1726867587.50169: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867587.50184: variable 'omit' from source: magic vars 30575 1726867587.50262: variable 'omit' from source: magic vars 30575 1726867587.50434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867587.52931: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867587.53010: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867587.53059: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867587.53114: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867587.53162: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867587.53252: variable 'network_provider' from source: set_fact 30575 1726867587.53398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867587.53440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867587.53470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867587.53526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867587.53582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867587.53618: variable 'omit' from source: magic vars 30575 1726867587.53731: variable 'omit' from source: magic vars 30575 1726867587.53837: variable 'network_connections' from source: include params 30575 1726867587.53863: variable 'interface' from source: play vars 30575 1726867587.53929: variable 'interface' from source: play vars 30575 1726867587.54180: variable 'omit' from source: magic vars 30575 1726867587.54183: variable '__lsr_ansible_managed' from source: task vars 30575 1726867587.54186: variable '__lsr_ansible_managed' from source: task vars 30575 1726867587.54366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30575 1726867587.54633: Loaded config def from plugin (lookup/template) 30575 1726867587.54649: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30575 1726867587.54689: File lookup term: get_ansible_managed.j2 30575 1726867587.54721: variable 
'ansible_search_path' from source: unknown 30575 1726867587.54725: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30575 1726867587.54736: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30575 1726867587.54764: variable 'ansible_search_path' from source: unknown 30575 1726867587.62261: variable 'ansible_managed' from source: unknown 30575 1726867587.62393: variable 'omit' from source: magic vars 30575 1726867587.62429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867587.62445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867587.62483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867587.62511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30575 1726867587.62514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867587.62603: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867587.62606: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867587.62609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867587.62657: Set connection var ansible_pipelining to False 30575 1726867587.62660: Set connection var ansible_shell_type to sh 30575 1726867587.62666: Set connection var ansible_shell_executable to /bin/sh 30575 1726867587.62671: Set connection var ansible_timeout to 10 30575 1726867587.62685: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867587.62689: Set connection var ansible_connection to ssh 30575 1726867587.62712: variable 'ansible_shell_executable' from source: unknown 30575 1726867587.62715: variable 'ansible_connection' from source: unknown 30575 1726867587.62717: variable 'ansible_module_compression' from source: unknown 30575 1726867587.62719: variable 'ansible_shell_type' from source: unknown 30575 1726867587.62721: variable 'ansible_shell_executable' from source: unknown 30575 1726867587.62726: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867587.62729: variable 'ansible_pipelining' from source: unknown 30575 1726867587.62731: variable 'ansible_timeout' from source: unknown 30575 1726867587.62733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867587.62884: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867587.62895: variable 'omit' from 
source: magic vars 30575 1726867587.62898: starting attempt loop 30575 1726867587.62900: running the handler 30575 1726867587.62911: _low_level_execute_command(): starting 30575 1726867587.62917: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867587.63551: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867587.63555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867587.63558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867587.63573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867587.63615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867587.63680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867587.65375: stdout chunk (state=3): >>>/root <<< 30575 1726867587.65497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867587.65523: stderr chunk (state=3): >>><<< 30575 1726867587.65530: stdout chunk (state=3): >>><<< 30575 
1726867587.65547: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867587.65556: _low_level_execute_command(): starting 30575 1726867587.65560: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867587.6554637-31621-19444908034828 `" && echo ansible-tmp-1726867587.6554637-31621-19444908034828="` echo /root/.ansible/tmp/ansible-tmp-1726867587.6554637-31621-19444908034828 `" ) && sleep 0' 30575 1726867587.66082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867587.66087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867587.66090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867587.66095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867587.66112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867587.66127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867587.66181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867587.68110: stdout chunk (state=3): >>>ansible-tmp-1726867587.6554637-31621-19444908034828=/root/.ansible/tmp/ansible-tmp-1726867587.6554637-31621-19444908034828 <<< 30575 1726867587.68227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867587.68254: stderr chunk (state=3): >>><<< 30575 1726867587.68257: stdout chunk (state=3): >>><<< 30575 1726867587.68271: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867587.6554637-31621-19444908034828=/root/.ansible/tmp/ansible-tmp-1726867587.6554637-31621-19444908034828 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867587.68307: variable 'ansible_module_compression' from source: unknown 30575 1726867587.68342: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30575 1726867587.68378: variable 'ansible_facts' from source: unknown 30575 1726867587.68469: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867587.6554637-31621-19444908034828/AnsiballZ_network_connections.py 30575 1726867587.68560: Sending initial data 30575 1726867587.68564: Sent initial data (167 bytes) 30575 1726867587.69140: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867587.69147: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867587.69149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867587.69151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867587.69217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867587.69222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867587.69225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867587.69279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867587.70850: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30575 1726867587.70854: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867587.70893: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867587.70939: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp51arz_si /root/.ansible/tmp/ansible-tmp-1726867587.6554637-31621-19444908034828/AnsiballZ_network_connections.py <<< 30575 1726867587.70947: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867587.6554637-31621-19444908034828/AnsiballZ_network_connections.py" <<< 30575 1726867587.70983: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp51arz_si" to remote "/root/.ansible/tmp/ansible-tmp-1726867587.6554637-31621-19444908034828/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867587.6554637-31621-19444908034828/AnsiballZ_network_connections.py" <<< 30575 1726867587.72026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867587.72181: stderr chunk (state=3): >>><<< 30575 1726867587.72184: stdout chunk (state=3): >>><<< 30575 1726867587.72186: done transferring module to remote 30575 1726867587.72188: _low_level_execute_command(): starting 30575 1726867587.72190: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867587.6554637-31621-19444908034828/ /root/.ansible/tmp/ansible-tmp-1726867587.6554637-31621-19444908034828/AnsiballZ_network_connections.py && sleep 0' 30575 1726867587.72739: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867587.72751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867587.72763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867587.72775: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867587.72798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867587.72807: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867587.72818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867587.72833: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867587.72842: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867587.72850: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867587.72913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867587.72941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867587.72961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867587.72971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867587.73038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867587.74817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867587.74844: stderr chunk (state=3): >>><<< 30575 1726867587.74847: stdout chunk (state=3): >>><<< 30575 1726867587.74858: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867587.74861: _low_level_execute_command(): starting 30575 1726867587.74865: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867587.6554637-31621-19444908034828/AnsiballZ_network_connections.py && sleep 0' 30575 1726867587.75260: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867587.75263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867587.75265: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867587.75267: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867587.75273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867587.75275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867587.75318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867587.75322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867587.75373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867588.01284: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 4a22b8e7-8099-4ce9-82e9-2718d4e0ef58\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30575 1726867588.03062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867588.03066: stdout chunk (state=3): >>><<< 30575 1726867588.03072: stderr chunk (state=3): >>><<< 30575 1726867588.03091: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 4a22b8e7-8099-4ce9-82e9-2718d4e0ef58\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867588.03122: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'autoconnect': False, 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867587.6554637-31621-19444908034828/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867588.03130: _low_level_execute_command(): starting 30575 1726867588.03135: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867587.6554637-31621-19444908034828/ > /dev/null 2>&1 && sleep 0' 30575 1726867588.03555: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867588.03559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867588.03589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match not found <<< 30575 1726867588.03592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867588.03594: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867588.03596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867588.03652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867588.03659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867588.03661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867588.03705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867588.05516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867588.05544: stderr chunk (state=3): >>><<< 30575 1726867588.05547: stdout chunk (state=3): >>><<< 30575 1726867588.05559: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867588.05567: handler run complete 30575 1726867588.05590: attempt loop complete, returning result 30575 1726867588.05593: _execute() done 30575 1726867588.05595: dumping result to json 30575 1726867588.05600: done dumping result, returning 30575 1726867588.05608: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-e081-a588-00000000073f] 30575 1726867588.05612: sending task result for task 0affcac9-a3a5-e081-a588-00000000073f 30575 1726867588.05711: done sending task result for task 0affcac9-a3a5-e081-a588-00000000073f 30575 1726867588.05713: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 4a22b8e7-8099-4ce9-82e9-2718d4e0ef58 30575 1726867588.05834: no more pending results, returning what we have 30575 1726867588.05837: results queue empty 
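For reference, the `module_args` in the `changed` result above correspond to a role input roughly like the following. This is a sketch, not copied from this run: the host/role wiring is hypothetical, and the `linux_system_roles` network role accepts the `network_connections` variable shown in the logged args.

```yaml
# Hypothetical playbook fragment that would yield the logged module_args
- hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: statebr
            type: bridge
            autoconnect: false
            persistent_state: present
            ip:
              dhcp4: false
              auto6: false
```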
30575 1726867588.05838: checking for any_errors_fatal 30575 1726867588.05843: done checking for any_errors_fatal 30575 1726867588.05843: checking for max_fail_percentage 30575 1726867588.05845: done checking for max_fail_percentage 30575 1726867588.05846: checking to see if all hosts have failed and the running result is not ok 30575 1726867588.05846: done checking to see if all hosts have failed 30575 1726867588.05847: getting the remaining hosts for this loop 30575 1726867588.05849: done getting the remaining hosts for this loop 30575 1726867588.05852: getting the next task for host managed_node3 30575 1726867588.05859: done getting next task for host managed_node3 30575 1726867588.05862: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867588.05866: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867588.05876: getting variables 30575 1726867588.05879: in VariableManager get_vars() 30575 1726867588.05911: Calling all_inventory to load vars for managed_node3 30575 1726867588.05913: Calling groups_inventory to load vars for managed_node3 30575 1726867588.05915: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867588.05925: Calling all_plugins_play to load vars for managed_node3 30575 1726867588.05928: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867588.05930: Calling groups_plugins_play to load vars for managed_node3 30575 1726867588.06718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867588.07669: done with get_vars() 30575 1726867588.07686: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:26:28 -0400 (0:00:00.590) 0:00:23.455 ****** 30575 1726867588.07749: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867588.07965: worker is 1 (out of 1 available) 30575 1726867588.07980: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867588.07993: done queuing things up, now waiting for results queue to drain 30575 1726867588.07995: waiting for pending results... 
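The `changed: true` result reported above is recovered by parsing the module's stdout as JSON. A minimal sketch of that parsing step, using a trimmed stand-in payload rather than the full result from this log:

```python
import json

# Trimmed, hypothetical stand-in for the network_connections stdout
# captured earlier in this run.
raw = (
    '{"changed": true, "warnings": [], '
    '"invocation": {"module_args": {"provider": "nm", '
    '"connections": [{"name": "statebr", "type": "bridge"}]}}}'
)

# Ansible decodes the module's stdout into a result dict like this.
result = json.loads(raw)
changed = result["changed"]
provider = result["invocation"]["module_args"]["provider"]
print(changed, provider)  # True nm
```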
30575 1726867588.08173: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867588.08262: in run() - task 0affcac9-a3a5-e081-a588-000000000740 30575 1726867588.08274: variable 'ansible_search_path' from source: unknown 30575 1726867588.08279: variable 'ansible_search_path' from source: unknown 30575 1726867588.08307: calling self._execute() 30575 1726867588.08380: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867588.08384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867588.08393: variable 'omit' from source: magic vars 30575 1726867588.08661: variable 'ansible_distribution_major_version' from source: facts 30575 1726867588.08665: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867588.08746: variable 'network_state' from source: role '' defaults 30575 1726867588.08756: Evaluated conditional (network_state != {}): False 30575 1726867588.08759: when evaluation is False, skipping this task 30575 1726867588.08764: _execute() done 30575 1726867588.08766: dumping result to json 30575 1726867588.08769: done dumping result, returning 30575 1726867588.08779: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-e081-a588-000000000740] 30575 1726867588.08782: sending task result for task 0affcac9-a3a5-e081-a588-000000000740 30575 1726867588.08911: done sending task result for task 0affcac9-a3a5-e081-a588-000000000740 30575 1726867588.08913: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867588.08967: no more pending results, returning what we have 30575 1726867588.08970: results queue empty 30575 1726867588.08970: checking for any_errors_fatal 30575 1726867588.09000: done checking for any_errors_fatal 
30575 1726867588.09002: checking for max_fail_percentage 30575 1726867588.09003: done checking for max_fail_percentage 30575 1726867588.09004: checking to see if all hosts have failed and the running result is not ok 30575 1726867588.09005: done checking to see if all hosts have failed 30575 1726867588.09006: getting the remaining hosts for this loop 30575 1726867588.09007: done getting the remaining hosts for this loop 30575 1726867588.09011: getting the next task for host managed_node3 30575 1726867588.09017: done getting next task for host managed_node3 30575 1726867588.09021: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867588.09031: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
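The skip above follows directly from the role default logged a few records earlier: `network_state` comes from the role defaults as `{}`, so the guarding condition `network_state != {}` evaluates to False. In plain Python terms, the logged evaluation amounts to:

```python
# network_state falls back to the role default of an empty dict,
# so the conditional guarding the task is False and the task skips.
network_state = {}
run_task = network_state != {}
skip_reason = None if run_task else "Conditional result was False"
print(run_task, skip_reason)  # False Conditional result was False
```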
False 30575 1726867588.09046: getting variables 30575 1726867588.09047: in VariableManager get_vars() 30575 1726867588.09074: Calling all_inventory to load vars for managed_node3 30575 1726867588.09076: Calling groups_inventory to load vars for managed_node3 30575 1726867588.09082: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867588.09089: Calling all_plugins_play to load vars for managed_node3 30575 1726867588.09091: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867588.09094: Calling groups_plugins_play to load vars for managed_node3 30575 1726867588.10154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867588.11640: done with get_vars() 30575 1726867588.11660: done getting variables 30575 1726867588.11755: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:26:28 -0400 (0:00:00.040) 0:00:23.495 ****** 30575 1726867588.11792: entering _queue_task() for managed_node3/debug 30575 1726867588.11993: worker is 1 (out of 1 available) 30575 1726867588.12007: exiting _queue_task() for managed_node3/debug 30575 1726867588.12018: done queuing things up, now waiting for results queue to drain 30575 1726867588.12020: waiting for pending results... 
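Each `_low_level_execute_command()` call in this log runs a plain `/bin/sh -c` one-liner over the multiplexed SSH session. For instance, the remote temp-directory step earlier in this run boils down to the following sketch (the base path and suffix here are illustrative, not the ones from this log):

```shell
# Create a mode-0700 per-task temp dir the way Ansible's one-liner does,
# then emit the name=path mapping that the controller parses from stdout.
tmpbase="${TMPDIR:-/tmp}/.ansible-demo/tmp"
stamp="ansible-tmp-$(date +%s)-$$-12345"   # hypothetical suffix
( umask 77 && mkdir -p "$tmpbase" && mkdir "$tmpbase/$stamp" \
  && echo "$stamp=$tmpbase/$stamp" )
```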
30575 1726867588.12239: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867588.12333: in run() - task 0affcac9-a3a5-e081-a588-000000000741 30575 1726867588.12583: variable 'ansible_search_path' from source: unknown 30575 1726867588.12592: variable 'ansible_search_path' from source: unknown 30575 1726867588.12595: calling self._execute() 30575 1726867588.12598: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867588.12601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867588.12603: variable 'omit' from source: magic vars 30575 1726867588.12848: variable 'ansible_distribution_major_version' from source: facts 30575 1726867588.12863: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867588.12874: variable 'omit' from source: magic vars 30575 1726867588.12937: variable 'omit' from source: magic vars 30575 1726867588.12971: variable 'omit' from source: magic vars 30575 1726867588.13013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867588.13049: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867588.13070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867588.13093: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867588.13109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867588.13141: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867588.13149: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867588.13156: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30575 1726867588.13257: Set connection var ansible_pipelining to False 30575 1726867588.13265: Set connection var ansible_shell_type to sh 30575 1726867588.13276: Set connection var ansible_shell_executable to /bin/sh 30575 1726867588.13288: Set connection var ansible_timeout to 10 30575 1726867588.13297: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867588.13307: Set connection var ansible_connection to ssh 30575 1726867588.13332: variable 'ansible_shell_executable' from source: unknown 30575 1726867588.13339: variable 'ansible_connection' from source: unknown 30575 1726867588.13345: variable 'ansible_module_compression' from source: unknown 30575 1726867588.13351: variable 'ansible_shell_type' from source: unknown 30575 1726867588.13356: variable 'ansible_shell_executable' from source: unknown 30575 1726867588.13362: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867588.13368: variable 'ansible_pipelining' from source: unknown 30575 1726867588.13374: variable 'ansible_timeout' from source: unknown 30575 1726867588.13384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867588.13507: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867588.13522: variable 'omit' from source: magic vars 30575 1726867588.13532: starting attempt loop 30575 1726867588.13538: running the handler 30575 1726867588.13653: variable '__network_connections_result' from source: set_fact 30575 1726867588.13706: handler run complete 30575 1726867588.13727: attempt loop complete, returning result 30575 1726867588.13734: _execute() done 30575 1726867588.13740: dumping result to json 30575 1726867588.13746: 
done dumping result, returning 30575 1726867588.13758: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-e081-a588-000000000741] 30575 1726867588.13766: sending task result for task 0affcac9-a3a5-e081-a588-000000000741 ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 4a22b8e7-8099-4ce9-82e9-2718d4e0ef58" ] } 30575 1726867588.13957: no more pending results, returning what we have 30575 1726867588.13961: results queue empty 30575 1726867588.13962: checking for any_errors_fatal 30575 1726867588.13968: done checking for any_errors_fatal 30575 1726867588.13968: checking for max_fail_percentage 30575 1726867588.13970: done checking for max_fail_percentage 30575 1726867588.13971: checking to see if all hosts have failed and the running result is not ok 30575 1726867588.13971: done checking to see if all hosts have failed 30575 1726867588.13972: getting the remaining hosts for this loop 30575 1726867588.13974: done getting the remaining hosts for this loop 30575 1726867588.13978: getting the next task for host managed_node3 30575 1726867588.13987: done getting next task for host managed_node3 30575 1726867588.13991: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867588.13995: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867588.14005: done sending task result for task 0affcac9-a3a5-e081-a588-000000000741 30575 1726867588.14008: WORKER PROCESS EXITING 30575 1726867588.14084: getting variables 30575 1726867588.14086: in VariableManager get_vars() 30575 1726867588.14114: Calling all_inventory to load vars for managed_node3 30575 1726867588.14116: Calling groups_inventory to load vars for managed_node3 30575 1726867588.14118: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867588.14127: Calling all_plugins_play to load vars for managed_node3 30575 1726867588.14129: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867588.14131: Calling groups_plugins_play to load vars for managed_node3 30575 1726867588.15675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867588.16522: done with get_vars() 30575 1726867588.16537: done getting variables 30575 1726867588.16575: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:26:28 -0400 (0:00:00.048) 0:00:23.543 ****** 30575 1726867588.16604: entering _queue_task() for managed_node3/debug 30575 1726867588.16804: worker is 1 (out of 1 available) 30575 1726867588.16817: exiting _queue_task() for managed_node3/debug 30575 1726867588.16830: done queuing things up, now waiting for results queue to drain 30575 1726867588.16832: waiting for pending results... 30575 1726867588.17022: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867588.17125: in run() - task 0affcac9-a3a5-e081-a588-000000000742 30575 1726867588.17139: variable 'ansible_search_path' from source: unknown 30575 1726867588.17142: variable 'ansible_search_path' from source: unknown 30575 1726867588.17170: calling self._execute() 30575 1726867588.17256: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867588.17260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867588.17271: variable 'omit' from source: magic vars 30575 1726867588.17566: variable 'ansible_distribution_major_version' from source: facts 30575 1726867588.17596: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867588.17789: variable 'omit' from source: magic vars 30575 1726867588.17792: variable 'omit' from source: magic vars 30575 1726867588.17794: variable 'omit' from source: magic vars 30575 1726867588.17796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867588.17811: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867588.17843: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867588.17864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867588.17884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867588.17921: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867588.17933: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867588.17940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867588.18047: Set connection var ansible_pipelining to False 30575 1726867588.18055: Set connection var ansible_shell_type to sh 30575 1726867588.18066: Set connection var ansible_shell_executable to /bin/sh 30575 1726867588.18079: Set connection var ansible_timeout to 10 30575 1726867588.18089: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867588.18100: Set connection var ansible_connection to ssh 30575 1726867588.18134: variable 'ansible_shell_executable' from source: unknown 30575 1726867588.18148: variable 'ansible_connection' from source: unknown 30575 1726867588.18155: variable 'ansible_module_compression' from source: unknown 30575 1726867588.18161: variable 'ansible_shell_type' from source: unknown 30575 1726867588.18168: variable 'ansible_shell_executable' from source: unknown 30575 1726867588.18174: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867588.18184: variable 'ansible_pipelining' from source: unknown 30575 1726867588.18190: variable 'ansible_timeout' from source: unknown 30575 1726867588.18198: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867588.18359: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867588.18382: variable 'omit' from source: magic vars 30575 1726867588.18399: starting attempt loop 30575 1726867588.18406: running the handler 30575 1726867588.18464: variable '__network_connections_result' from source: set_fact 30575 1726867588.18781: variable '__network_connections_result' from source: set_fact 30575 1726867588.18784: handler run complete 30575 1726867588.18899: attempt loop complete, returning result 30575 1726867588.18906: _execute() done 30575 1726867588.18911: dumping result to json 30575 1726867588.18918: done dumping result, returning 30575 1726867588.18932: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-e081-a588-000000000742] 30575 1726867588.18941: sending task result for task 0affcac9-a3a5-e081-a588-000000000742 ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 4a22b8e7-8099-4ce9-82e9-2718d4e0ef58\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 4a22b8e7-8099-4ce9-82e9-2718d4e0ef58" ] } } 30575 1726867588.19122: no more pending results, returning what we 
have 30575 1726867588.19126: results queue empty 30575 1726867588.19127: checking for any_errors_fatal 30575 1726867588.19134: done checking for any_errors_fatal 30575 1726867588.19135: checking for max_fail_percentage 30575 1726867588.19137: done checking for max_fail_percentage 30575 1726867588.19138: checking to see if all hosts have failed and the running result is not ok 30575 1726867588.19139: done checking to see if all hosts have failed 30575 1726867588.19139: getting the remaining hosts for this loop 30575 1726867588.19142: done getting the remaining hosts for this loop 30575 1726867588.19145: getting the next task for host managed_node3 30575 1726867588.19155: done getting next task for host managed_node3 30575 1726867588.19159: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867588.19164: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 30575 1726867588.19176: getting variables 30575 1726867588.19179: in VariableManager get_vars() 30575 1726867588.19215: Calling all_inventory to load vars for managed_node3 30575 1726867588.19218: Calling groups_inventory to load vars for managed_node3 30575 1726867588.19226: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867588.19238: Calling all_plugins_play to load vars for managed_node3 30575 1726867588.19241: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867588.19244: Calling groups_plugins_play to load vars for managed_node3 30575 1726867588.19891: done sending task result for task 0affcac9-a3a5-e081-a588-000000000742 30575 1726867588.19895: WORKER PROCESS EXITING 30575 1726867588.20574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867588.22060: done with get_vars() 30575 1726867588.22083: done getting variables 30575 1726867588.22141: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:26:28 -0400 (0:00:00.055) 0:00:23.599 ****** 30575 1726867588.22174: entering _queue_task() for managed_node3/debug 30575 1726867588.22464: worker is 1 (out of 1 available) 30575 1726867588.22681: exiting _queue_task() for managed_node3/debug 30575 1726867588.22691: done queuing things up, now waiting for results queue to drain 30575 1726867588.22693: waiting for pending results... 
30575 1726867588.22820: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867588.22926: in run() - task 0affcac9-a3a5-e081-a588-000000000743 30575 1726867588.22947: variable 'ansible_search_path' from source: unknown 30575 1726867588.22956: variable 'ansible_search_path' from source: unknown 30575 1726867588.23028: calling self._execute() 30575 1726867588.23101: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867588.23114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867588.23134: variable 'omit' from source: magic vars 30575 1726867588.23570: variable 'ansible_distribution_major_version' from source: facts 30575 1726867588.23574: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867588.23659: variable 'network_state' from source: role '' defaults 30575 1726867588.23683: Evaluated conditional (network_state != {}): False 30575 1726867588.23693: when evaluation is False, skipping this task 30575 1726867588.23701: _execute() done 30575 1726867588.23709: dumping result to json 30575 1726867588.23716: done dumping result, returning 30575 1726867588.23734: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-e081-a588-000000000743] 30575 1726867588.23745: sending task result for task 0affcac9-a3a5-e081-a588-000000000743 skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30575 1726867588.23893: no more pending results, returning what we have 30575 1726867588.23897: results queue empty 30575 1726867588.23897: checking for any_errors_fatal 30575 1726867588.23907: done checking for any_errors_fatal 30575 1726867588.23908: checking for max_fail_percentage 30575 1726867588.23910: done checking for max_fail_percentage 30575 1726867588.23911: checking to see if all hosts have 
failed and the running result is not ok 30575 1726867588.23912: done checking to see if all hosts have failed 30575 1726867588.23913: getting the remaining hosts for this loop 30575 1726867588.23915: done getting the remaining hosts for this loop 30575 1726867588.23919: getting the next task for host managed_node3 30575 1726867588.23930: done getting next task for host managed_node3 30575 1726867588.23934: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867588.23939: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867588.23958: getting variables 30575 1726867588.23960: in VariableManager get_vars() 30575 1726867588.23999: Calling all_inventory to load vars for managed_node3 30575 1726867588.24002: Calling groups_inventory to load vars for managed_node3 30575 1726867588.24004: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867588.24016: Calling all_plugins_play to load vars for managed_node3 30575 1726867588.24020: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867588.24026: Calling groups_plugins_play to load vars for managed_node3 30575 1726867588.24860: done sending task result for task 0affcac9-a3a5-e081-a588-000000000743 30575 1726867588.24863: WORKER PROCESS EXITING 30575 1726867588.30179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867588.31889: done with get_vars() 30575 1726867588.31913: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:26:28 -0400 (0:00:00.098) 0:00:23.697 ****** 30575 1726867588.32004: entering _queue_task() for managed_node3/ping 30575 1726867588.32372: worker is 1 (out of 1 available) 30575 1726867588.32490: exiting _queue_task() for managed_node3/ping 30575 1726867588.32502: done queuing things up, now waiting for results queue to drain 30575 1726867588.32504: waiting for pending results... 
30575 1726867588.32730: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867588.32986: in run() - task 0affcac9-a3a5-e081-a588-000000000744 30575 1726867588.32991: variable 'ansible_search_path' from source: unknown 30575 1726867588.32994: variable 'ansible_search_path' from source: unknown 30575 1726867588.32998: calling self._execute() 30575 1726867588.33189: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867588.33193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867588.33195: variable 'omit' from source: magic vars 30575 1726867588.33656: variable 'ansible_distribution_major_version' from source: facts 30575 1726867588.33667: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867588.33675: variable 'omit' from source: magic vars 30575 1726867588.33738: variable 'omit' from source: magic vars 30575 1726867588.33776: variable 'omit' from source: magic vars 30575 1726867588.33815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867588.33874: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867588.33881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867588.33893: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867588.33917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867588.33936: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867588.33939: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867588.33942: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867588.34283: Set connection var ansible_pipelining to False 30575 1726867588.34287: Set connection var ansible_shell_type to sh 30575 1726867588.34290: Set connection var ansible_shell_executable to /bin/sh 30575 1726867588.34293: Set connection var ansible_timeout to 10 30575 1726867588.34295: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867588.34297: Set connection var ansible_connection to ssh 30575 1726867588.34300: variable 'ansible_shell_executable' from source: unknown 30575 1726867588.34302: variable 'ansible_connection' from source: unknown 30575 1726867588.34304: variable 'ansible_module_compression' from source: unknown 30575 1726867588.34306: variable 'ansible_shell_type' from source: unknown 30575 1726867588.34308: variable 'ansible_shell_executable' from source: unknown 30575 1726867588.34311: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867588.34495: variable 'ansible_pipelining' from source: unknown 30575 1726867588.34499: variable 'ansible_timeout' from source: unknown 30575 1726867588.34501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867588.34781: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867588.34789: variable 'omit' from source: magic vars 30575 1726867588.34795: starting attempt loop 30575 1726867588.34798: running the handler 30575 1726867588.34813: _low_level_execute_command(): starting 30575 1726867588.34821: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867588.35760: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867588.35769: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867588.35781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867588.35868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867588.35871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867588.35898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867588.35964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867588.37650: stdout chunk (state=3): >>>/root <<< 30575 1726867588.37819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867588.38080: stderr chunk (state=3): >>><<< 30575 1726867588.38084: stdout chunk (state=3): >>><<< 30575 1726867588.38089: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867588.38092: _low_level_execute_command(): starting 30575 1726867588.38095: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867588.3799875-31663-105678618189837 `" && echo ansible-tmp-1726867588.3799875-31663-105678618189837="` echo /root/.ansible/tmp/ansible-tmp-1726867588.3799875-31663-105678618189837 `" ) && sleep 0' 30575 1726867588.38919: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867588.38935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867588.38949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867588.38967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867588.38990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867588.39093: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867588.39146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867588.39186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867588.39227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867588.41134: stdout chunk (state=3): >>>ansible-tmp-1726867588.3799875-31663-105678618189837=/root/.ansible/tmp/ansible-tmp-1726867588.3799875-31663-105678618189837 <<< 30575 1726867588.41275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867588.41364: stdout chunk (state=3): >>><<< 30575 1726867588.41383: stderr chunk (state=3): >>><<< 30575 1726867588.41418: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867588.3799875-31663-105678618189837=/root/.ansible/tmp/ansible-tmp-1726867588.3799875-31663-105678618189837 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867588.41557: variable 'ansible_module_compression' from source: unknown 30575 1726867588.41646: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30575 1726867588.41865: variable 'ansible_facts' from source: unknown 30575 1726867588.42279: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867588.3799875-31663-105678618189837/AnsiballZ_ping.py 30575 1726867588.42611: Sending initial data 30575 1726867588.42799: Sent initial data (153 bytes) 30575 1726867588.43694: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867588.43697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867588.43701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30575 1726867588.44057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867588.44398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867588.46009: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867588.46076: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpmz02gikf /root/.ansible/tmp/ansible-tmp-1726867588.3799875-31663-105678618189837/AnsiballZ_ping.py <<< 30575 1726867588.46090: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867588.3799875-31663-105678618189837/AnsiballZ_ping.py" <<< 30575 1726867588.46583: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpmz02gikf" to remote "/root/.ansible/tmp/ansible-tmp-1726867588.3799875-31663-105678618189837/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867588.3799875-31663-105678618189837/AnsiballZ_ping.py" <<< 30575 1726867588.47922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867588.47946: stderr chunk (state=3): >>><<< 30575 1726867588.47949: stdout chunk (state=3): >>><<< 30575 1726867588.47970: done transferring module to remote 30575 1726867588.47982: _low_level_execute_command(): starting 30575 1726867588.47988: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867588.3799875-31663-105678618189837/ /root/.ansible/tmp/ansible-tmp-1726867588.3799875-31663-105678618189837/AnsiballZ_ping.py && sleep 0' 30575 1726867588.49144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867588.49395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867588.49513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867588.49533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867588.49606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867588.51396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867588.51444: stderr chunk (state=3): >>><<< 30575 1726867588.51491: stdout chunk (state=3): >>><<< 30575 1726867588.51511: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867588.51514: _low_level_execute_command(): starting 30575 1726867588.51517: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867588.3799875-31663-105678618189837/AnsiballZ_ping.py && sleep 0' 30575 1726867588.52718: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867588.52826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867588.52861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867588.52910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867588.67947: stdout chunk (state=3): >>> {"ping": "pong", "invocation": 
{"module_args": {"data": "pong"}}} <<< 30575 1726867588.69394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867588.69454: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. <<< 30575 1726867588.69458: stdout chunk (state=3): >>><<< 30575 1726867588.69460: stderr chunk (state=3): >>><<< 30575 1726867588.69530: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
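The entries above trace one complete `_low_level_execute_command()` cycle for the ping module: create a unique remote temp dir, transfer `AnsiballZ_ping.py` over SFTP, `chmod u+x` the wrapper, execute it with the remote interpreter, and read the JSON result from stdout. As a rough illustration only (run locally rather than through the SSH connection plugin, with a stand-in module body in place of the real AnsiballZ payload), the cycle can be sketched as:

```python
import json
import subprocess
import tempfile
from pathlib import Path

# Illustrative stand-in for the AnsiballZ_ping.py payload; the real wrapper
# carries a zipped copy of the module and its imports.

# 1. Create a unique temp dir (the log's `umask 77 && mkdir -p ...` step).
tmpdir = Path(tempfile.mkdtemp(prefix="ansible-tmp-demo-"))

# 2. "Transfer" the module (the log does this with an `sftp put`).
module = tmpdir / "AnsiballZ_ping.py"
module.write_text(
    'import json\nprint(json.dumps({"ping": "pong", '
    '"invocation": {"module_args": {"data": "pong"}}}))\n'
)

# 3. chmod u+x on both the directory and the wrapper, as in the log.
subprocess.run(["chmod", "u+x", str(tmpdir), str(module)], check=True)

# 4. Execute the wrapper with the target interpreter and parse its stdout,
#    mirroring `/usr/bin/python3.12 .../AnsiballZ_ping.py && sleep 0`.
proc = subprocess.run(
    ["python3", str(module)], capture_output=True, text=True, check=True
)
result = json.loads(proc.stdout)
print(result["ping"])  # -> pong

# 5. Clean up the temp dir (`rm -f -r ... > /dev/null 2>&1`).
subprocess.run(["rm", "-rf", str(tmpdir)], check=True)
```

This sketch assumes a POSIX host with `python3` and `chmod` on the PATH; in the real run each numbered step is a separate SSH invocation multiplexed over the ControlMaster socket at `/root/.ansible/cp/...`, which is why every step re-prints the client-side config-parsing debug output.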
30575 1726867588.69685: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867588.3799875-31663-105678618189837/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867588.69689: _low_level_execute_command(): starting 30575 1726867588.69691: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867588.3799875-31663-105678618189837/ > /dev/null 2>&1 && sleep 0' 30575 1726867588.70898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867588.70929: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867588.71197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867588.71220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867588.71378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867588.73149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867588.73200: stderr chunk (state=3): >>><<< 30575 1726867588.73206: stdout chunk (state=3): >>><<< 30575 1726867588.73229: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867588.73483: handler run complete 30575 1726867588.73487: attempt loop complete, returning result 30575 1726867588.73489: _execute() done 30575 
1726867588.73492: dumping result to json 30575 1726867588.73494: done dumping result, returning 30575 1726867588.73497: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-e081-a588-000000000744] 30575 1726867588.73499: sending task result for task 0affcac9-a3a5-e081-a588-000000000744 30575 1726867588.73567: done sending task result for task 0affcac9-a3a5-e081-a588-000000000744 30575 1726867588.73571: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30575 1726867588.73641: no more pending results, returning what we have 30575 1726867588.73645: results queue empty 30575 1726867588.73645: checking for any_errors_fatal 30575 1726867588.73654: done checking for any_errors_fatal 30575 1726867588.73654: checking for max_fail_percentage 30575 1726867588.73656: done checking for max_fail_percentage 30575 1726867588.73656: checking to see if all hosts have failed and the running result is not ok 30575 1726867588.73657: done checking to see if all hosts have failed 30575 1726867588.73658: getting the remaining hosts for this loop 30575 1726867588.73659: done getting the remaining hosts for this loop 30575 1726867588.73663: getting the next task for host managed_node3 30575 1726867588.73673: done getting next task for host managed_node3 30575 1726867588.73675: ^ task is: TASK: meta (role_complete) 30575 1726867588.73683: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867588.73695: getting variables 30575 1726867588.73697: in VariableManager get_vars() 30575 1726867588.73737: Calling all_inventory to load vars for managed_node3 30575 1726867588.73740: Calling groups_inventory to load vars for managed_node3 30575 1726867588.73742: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867588.73751: Calling all_plugins_play to load vars for managed_node3 30575 1726867588.73753: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867588.73756: Calling groups_plugins_play to load vars for managed_node3 30575 1726867588.77960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867588.82200: done with get_vars() 30575 1726867588.82235: done getting variables 30575 1726867588.82542: done queuing things up, now waiting for results queue to drain 30575 1726867588.82545: results queue empty 30575 1726867588.82546: checking for any_errors_fatal 30575 1726867588.82548: done checking for any_errors_fatal 30575 1726867588.82549: checking for max_fail_percentage 30575 1726867588.82550: done checking for max_fail_percentage 30575 1726867588.82551: checking to see if all hosts have failed and the running result is not ok 30575 1726867588.82552: done checking to see if all hosts have failed 30575 1726867588.82553: getting the remaining hosts for this 
loop 30575 1726867588.82556: done getting the remaining hosts for this loop 30575 1726867588.82559: getting the next task for host managed_node3 30575 1726867588.82565: done getting next task for host managed_node3 30575 1726867588.82567: ^ task is: TASK: Show result 30575 1726867588.82572: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867588.82575: getting variables 30575 1726867588.82576: in VariableManager get_vars() 30575 1726867588.82589: Calling all_inventory to load vars for managed_node3 30575 1726867588.82591: Calling groups_inventory to load vars for managed_node3 30575 1726867588.82593: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867588.82599: Calling all_plugins_play to load vars for managed_node3 30575 1726867588.82601: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867588.82604: Calling groups_plugins_play to load vars for managed_node3 30575 1726867588.85304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867588.88443: done with get_vars() 30575 1726867588.88463: done getting variables 30575 1726867588.88560: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml:15 Friday 20 September 2024 17:26:28 -0400 (0:00:00.565) 0:00:24.263 ****** 30575 1726867588.88593: entering _queue_task() for managed_node3/debug 30575 1726867588.89006: worker is 1 (out of 1 available) 30575 1726867588.89019: exiting _queue_task() for managed_node3/debug 30575 1726867588.89035: done queuing things up, now waiting for results queue to drain 30575 1726867588.89036: waiting for pending results... 
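The numeric prefix on each entry above is the worker PID followed by a Unix epoch timestamp in seconds, and the task banner's timing line (`(0:00:00.565)  0:00:24.263`) is derived from deltas between such stamps. As a small sketch using two stamps copied from this log (the "starting run" entry in the header and the `entering _queue_task()` entry just above; the banner's cumulative figure starts at play start, slightly after run start, so the delta here comes out a fraction larger):

```python
from datetime import datetime, timezone

# Stamps copied verbatim from the log.
RUN_START = 1726867564.41999     # "starting run ansible-playbook [core 2.17.4]"
QUEUE_DEBUG = 1726867588.88593   # "entering _queue_task() for managed_node3/debug"

# Elapsed wall-clock time since the run started (~24.5 s, consistent with
# the banner's cumulative 0:00:24.263 measured from play start).
elapsed = QUEUE_DEBUG - RUN_START
print(f"elapsed: {elapsed:.3f}s")

# The epoch stamp also matches the banner's local time,
# "Friday 20 September 2024 17:26:28 -0400", i.e. 21:26:28 UTC.
utc = datetime.fromtimestamp(QUEUE_DEBUG, tz=timezone.utc)
print(utc.strftime("%Y-%m-%d %H:%M:%S"))
```

Correlating these raw stamps is often the quickest way to see where a verbose run spends its time, since the per-task banners only appear at task boundaries.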
30575 1726867588.89796: running TaskExecutor() for managed_node3/TASK: Show result 30575 1726867588.89802: in run() - task 0affcac9-a3a5-e081-a588-0000000006b2 30575 1726867588.89806: variable 'ansible_search_path' from source: unknown 30575 1726867588.89809: variable 'ansible_search_path' from source: unknown 30575 1726867588.89812: calling self._execute() 30575 1726867588.89815: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867588.89817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867588.89820: variable 'omit' from source: magic vars 30575 1726867588.90483: variable 'ansible_distribution_major_version' from source: facts 30575 1726867588.90487: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867588.90489: variable 'omit' from source: magic vars 30575 1726867588.90492: variable 'omit' from source: magic vars 30575 1726867588.90494: variable 'omit' from source: magic vars 30575 1726867588.90496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867588.90499: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867588.90502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867588.90591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867588.90604: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867588.90639: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867588.90642: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867588.90645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867588.90905: Set 
connection var ansible_pipelining to False 30575 1726867588.90908: Set connection var ansible_shell_type to sh 30575 1726867588.90983: Set connection var ansible_shell_executable to /bin/sh 30575 1726867588.90987: Set connection var ansible_timeout to 10 30575 1726867588.90989: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867588.90992: Set connection var ansible_connection to ssh 30575 1726867588.91012: variable 'ansible_shell_executable' from source: unknown 30575 1726867588.91053: variable 'ansible_connection' from source: unknown 30575 1726867588.91061: variable 'ansible_module_compression' from source: unknown 30575 1726867588.91067: variable 'ansible_shell_type' from source: unknown 30575 1726867588.91073: variable 'ansible_shell_executable' from source: unknown 30575 1726867588.91096: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867588.91118: variable 'ansible_pipelining' from source: unknown 30575 1726867588.91146: variable 'ansible_timeout' from source: unknown 30575 1726867588.91156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867588.91460: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867588.91481: variable 'omit' from source: magic vars 30575 1726867588.91491: starting attempt loop 30575 1726867588.91496: running the handler 30575 1726867588.91546: variable '__network_connections_result' from source: set_fact 30575 1726867588.91649: variable '__network_connections_result' from source: set_fact 30575 1726867588.91781: handler run complete 30575 1726867588.91820: attempt loop complete, returning result 30575 1726867588.91831: _execute() done 30575 1726867588.91838: dumping result to json 30575 
1726867588.91847: done dumping result, returning 30575 1726867588.91858: done running TaskExecutor() for managed_node3/TASK: Show result [0affcac9-a3a5-e081-a588-0000000006b2] 30575 1726867588.91867: sending task result for task 0affcac9-a3a5-e081-a588-0000000006b2 ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 4a22b8e7-8099-4ce9-82e9-2718d4e0ef58\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 4a22b8e7-8099-4ce9-82e9-2718d4e0ef58" ] } } 30575 1726867588.92089: no more pending results, returning what we have 30575 1726867588.92092: results queue empty 30575 1726867588.92093: checking for any_errors_fatal 30575 1726867588.92095: done checking for any_errors_fatal 30575 1726867588.92096: checking for max_fail_percentage 30575 1726867588.92098: done checking for max_fail_percentage 30575 1726867588.92099: checking to see if all hosts have failed and the running result is not ok 30575 1726867588.92100: done checking to see if all hosts have failed 30575 1726867588.92101: getting the remaining hosts for this loop 30575 1726867588.92102: done getting the remaining hosts for this loop 30575 1726867588.92106: getting the next task for host managed_node3 30575 1726867588.92117: done getting next task for host managed_node3 30575 1726867588.92121: ^ task is: TASK: Asserts 30575 1726867588.92126: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867588.92131: getting variables 30575 1726867588.92133: in VariableManager get_vars() 30575 1726867588.92164: Calling all_inventory to load vars for managed_node3 30575 1726867588.92166: Calling groups_inventory to load vars for managed_node3 30575 1726867588.92170: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867588.92285: Calling all_plugins_play to load vars for managed_node3 30575 1726867588.92290: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867588.92295: done sending task result for task 0affcac9-a3a5-e081-a588-0000000006b2 30575 1726867588.92298: WORKER PROCESS EXITING 30575 1726867588.92302: Calling groups_plugins_play to load vars for managed_node3 30575 1726867588.94278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867588.96259: done with get_vars() 30575 1726867588.96282: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 17:26:28 -0400 (0:00:00.077) 0:00:24.341 ****** 30575 1726867588.96380: entering _queue_task() for managed_node3/include_tasks 30575 1726867588.97084: worker is 1 (out of 1 available) 30575 1726867588.97100: exiting _queue_task() for managed_node3/include_tasks 30575 1726867588.97115: done queuing things up, now waiting for results queue to drain 30575 1726867588.97117: waiting for 
pending results... 30575 1726867588.97897: running TaskExecutor() for managed_node3/TASK: Asserts 30575 1726867588.97924: in run() - task 0affcac9-a3a5-e081-a588-0000000005b9 30575 1726867588.97947: variable 'ansible_search_path' from source: unknown 30575 1726867588.97956: variable 'ansible_search_path' from source: unknown 30575 1726867588.98210: variable 'lsr_assert' from source: include params 30575 1726867588.98620: variable 'lsr_assert' from source: include params 30575 1726867588.98700: variable 'omit' from source: magic vars 30575 1726867588.98947: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867588.98962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867588.98979: variable 'omit' from source: magic vars 30575 1726867588.99405: variable 'ansible_distribution_major_version' from source: facts 30575 1726867588.99495: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867588.99559: variable 'item' from source: unknown 30575 1726867588.99625: variable 'item' from source: unknown 30575 1726867588.99779: variable 'item' from source: unknown 30575 1726867588.99842: variable 'item' from source: unknown 30575 1726867589.00283: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867589.00287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867589.00290: variable 'omit' from source: magic vars 30575 1726867589.00582: variable 'ansible_distribution_major_version' from source: facts 30575 1726867589.00586: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867589.00589: variable 'item' from source: unknown 30575 1726867589.00629: variable 'item' from source: unknown 30575 1726867589.00661: variable 'item' from source: unknown 30575 1726867589.00788: variable 'item' from source: unknown 30575 1726867589.00937: dumping result to json 30575 1726867589.00940: done dumping 
result, returning 30575 1726867589.00942: done running TaskExecutor() for managed_node3/TASK: Asserts [0affcac9-a3a5-e081-a588-0000000005b9] 30575 1726867589.01152: sending task result for task 0affcac9-a3a5-e081-a588-0000000005b9 30575 1726867589.01197: done sending task result for task 0affcac9-a3a5-e081-a588-0000000005b9 30575 1726867589.01202: WORKER PROCESS EXITING 30575 1726867589.01230: no more pending results, returning what we have 30575 1726867589.01235: in VariableManager get_vars() 30575 1726867589.01275: Calling all_inventory to load vars for managed_node3 30575 1726867589.01280: Calling groups_inventory to load vars for managed_node3 30575 1726867589.01284: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867589.01299: Calling all_plugins_play to load vars for managed_node3 30575 1726867589.01303: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867589.01306: Calling groups_plugins_play to load vars for managed_node3 30575 1726867589.04200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867589.07123: done with get_vars() 30575 1726867589.07143: variable 'ansible_search_path' from source: unknown 30575 1726867589.07145: variable 'ansible_search_path' from source: unknown 30575 1726867589.07391: variable 'ansible_search_path' from source: unknown 30575 1726867589.07392: variable 'ansible_search_path' from source: unknown 30575 1726867589.07426: we have included files to process 30575 1726867589.07428: generating all_blocks data 30575 1726867589.07430: done generating all_blocks data 30575 1726867589.07436: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30575 1726867589.07437: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30575 
1726867589.07440: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30575 1726867589.07562: in VariableManager get_vars() 30575 1726867589.07787: done with get_vars() 30575 1726867589.07902: done processing included file 30575 1726867589.07904: iterating over new_blocks loaded from include file 30575 1726867589.07905: in VariableManager get_vars() 30575 1726867589.07920: done with get_vars() 30575 1726867589.07922: filtering new block on tags 30575 1726867589.07958: done filtering new block on tags 30575 1726867589.07961: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node3 => (item=tasks/assert_device_absent.yml) 30575 1726867589.07965: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30575 1726867589.07966: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30575 1726867589.07969: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30575 1726867589.08271: in VariableManager get_vars() 30575 1726867589.08292: done with get_vars() 30575 1726867589.08726: done processing included file 30575 1726867589.08728: iterating over new_blocks loaded from include file 30575 1726867589.08729: in VariableManager get_vars() 30575 1726867589.08742: done with get_vars() 30575 1726867589.08744: filtering new block on tags 30575 1726867589.09001: done filtering new block on tags 30575 1726867589.09003: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=tasks/assert_profile_present.yml) 30575 1726867589.09008: extending task lists for all hosts with included blocks 30575 1726867589.11109: done extending task lists 30575 1726867589.11111: done processing included files 30575 1726867589.11111: results queue empty 30575 1726867589.11112: checking for any_errors_fatal 30575 1726867589.11118: done checking for any_errors_fatal 30575 1726867589.11119: checking for max_fail_percentage 30575 1726867589.11120: done checking for max_fail_percentage 30575 1726867589.11121: checking to see if all hosts have failed and the running result is not ok 30575 1726867589.11121: done checking to see if all hosts have failed 30575 1726867589.11122: getting the remaining hosts for this loop 30575 1726867589.11126: done getting the remaining hosts for this loop 30575 1726867589.11128: getting the next task for host managed_node3 30575 1726867589.11133: done getting next task for host managed_node3 30575 1726867589.11135: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30575 1726867589.11138: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30575 1726867589.11147: getting variables 30575 1726867589.11148: in VariableManager get_vars() 30575 1726867589.11158: Calling all_inventory to load vars for managed_node3 30575 1726867589.11160: Calling groups_inventory to load vars for managed_node3 30575 1726867589.11162: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867589.11168: Calling all_plugins_play to load vars for managed_node3 30575 1726867589.11170: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867589.11173: Calling groups_plugins_play to load vars for managed_node3 30575 1726867589.13701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867589.16752: done with get_vars() 30575 1726867589.16982: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 17:26:29 -0400 (0:00:00.206) 0:00:24.548 ****** 30575 1726867589.17065: entering _queue_task() for managed_node3/include_tasks 30575 1726867589.17853: worker is 1 (out of 1 available) 30575 1726867589.17868: exiting _queue_task() for managed_node3/include_tasks 30575 1726867589.17883: done queuing things up, now waiting for results queue to drain 30575 1726867589.17884: waiting for pending results... 
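The sequence just logged — loading `tasks/assert_device_absent.yml` for managed_node3 and then queueing its first task, "Include the task 'get_interface_stat.yml'" (task path `assert_device_absent.yml:3`) — reflects a common test-helper pattern in these playbooks: gather interface state first, then assert on it. As a hedged sketch only (the file contents are not shown in this log, and the register name is an assumption), `assert_device_absent.yml` plausibly looks like:

```yaml
# Hypothetical reconstruction of tasks/assert_device_absent.yml, inferred
# from the task names and paths in the log -- not the verbatim file.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: tasks/get_interface_stat.yml

- name: Assert that the device is absent            # assumed follow-up task
  assert:
    that:
      - not interface_stat.stat.exists              # 'interface_stat' is an assumed register name
```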
30575 1726867589.18566: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 30575 1726867589.18687: in run() - task 0affcac9-a3a5-e081-a588-0000000008a8 30575 1726867589.18703: variable 'ansible_search_path' from source: unknown 30575 1726867589.18706: variable 'ansible_search_path' from source: unknown 30575 1726867589.18881: calling self._execute() 30575 1726867589.19050: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867589.19053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867589.19064: variable 'omit' from source: magic vars 30575 1726867589.19856: variable 'ansible_distribution_major_version' from source: facts 30575 1726867589.19868: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867589.19874: _execute() done 30575 1726867589.19879: dumping result to json 30575 1726867589.19884: done dumping result, returning 30575 1726867589.19892: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-e081-a588-0000000008a8] 30575 1726867589.19897: sending task result for task 0affcac9-a3a5-e081-a588-0000000008a8 30575 1726867589.20126: done sending task result for task 0affcac9-a3a5-e081-a588-0000000008a8 30575 1726867589.20129: WORKER PROCESS EXITING 30575 1726867589.20196: no more pending results, returning what we have 30575 1726867589.20202: in VariableManager get_vars() 30575 1726867589.20243: Calling all_inventory to load vars for managed_node3 30575 1726867589.20246: Calling groups_inventory to load vars for managed_node3 30575 1726867589.20250: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867589.20265: Calling all_plugins_play to load vars for managed_node3 30575 1726867589.20268: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867589.20270: Calling groups_plugins_play to load vars for managed_node3 30575 
1726867589.23117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867589.26314: done with get_vars() 30575 1726867589.26336: variable 'ansible_search_path' from source: unknown 30575 1726867589.26338: variable 'ansible_search_path' from source: unknown 30575 1726867589.26347: variable 'item' from source: include params 30575 1726867589.26460: variable 'item' from source: include params 30575 1726867589.26699: we have included files to process 30575 1726867589.26701: generating all_blocks data 30575 1726867589.26703: done generating all_blocks data 30575 1726867589.26704: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867589.26705: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867589.26708: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867589.27099: done processing included file 30575 1726867589.27102: iterating over new_blocks loaded from include file 30575 1726867589.27103: in VariableManager get_vars() 30575 1726867589.27120: done with get_vars() 30575 1726867589.27121: filtering new block on tags 30575 1726867589.27150: done filtering new block on tags 30575 1726867589.27153: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 30575 1726867589.27157: extending task lists for all hosts with included blocks 30575 1726867589.27528: done extending task lists 30575 1726867589.27529: done processing included files 30575 1726867589.27530: results queue empty 30575 1726867589.27531: checking for any_errors_fatal 30575 1726867589.27535: done 
checking for any_errors_fatal 30575 1726867589.27536: checking for max_fail_percentage 30575 1726867589.27537: done checking for max_fail_percentage 30575 1726867589.27537: checking to see if all hosts have failed and the running result is not ok 30575 1726867589.27538: done checking to see if all hosts have failed 30575 1726867589.27539: getting the remaining hosts for this loop 30575 1726867589.27540: done getting the remaining hosts for this loop 30575 1726867589.27543: getting the next task for host managed_node3 30575 1726867589.27547: done getting next task for host managed_node3 30575 1726867589.27550: ^ task is: TASK: Get stat for interface {{ interface }} 30575 1726867589.27553: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867589.27555: getting variables 30575 1726867589.27556: in VariableManager get_vars() 30575 1726867589.27565: Calling all_inventory to load vars for managed_node3 30575 1726867589.27567: Calling groups_inventory to load vars for managed_node3 30575 1726867589.27569: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867589.27575: Calling all_plugins_play to load vars for managed_node3 30575 1726867589.27579: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867589.27582: Calling groups_plugins_play to load vars for managed_node3 30575 1726867589.29988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867589.33165: done with get_vars() 30575 1726867589.33194: done getting variables 30575 1726867589.33530: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:26:29 -0400 (0:00:00.164) 0:00:24.713 ****** 30575 1726867589.33561: entering _queue_task() for managed_node3/stat 30575 1726867589.34513: worker is 1 (out of 1 available) 30575 1726867589.34522: exiting _queue_task() for managed_node3/stat 30575 1726867589.34534: done queuing things up, now waiting for results queue to drain 30575 1726867589.34536: waiting for pending results... 
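The "Get stat for interface statebr" task queued here ultimately invokes the `stat` module with the arguments visible in the module result later in the log (`path: /sys/class/net/statebr`, `get_attributes: false`, `get_checksum: false`, `get_mime: false`). A hedged sketch of what `tasks/get_interface_stat.yml` likely contains, reconstructed from those module_args rather than from the file itself:

```yaml
# Hypothetical reconstruction of tasks/get_interface_stat.yml, based on the
# module_args shown in the stat result -- not the verbatim file.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"   # resolves to /sys/class/net/statebr in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat                   # register name is an assumption
```

Checking `/sys/class/net/<name>` is a cheap existence test for a kernel network device; here the result (`"exists": false`) is what the subsequent absence assertion consumes.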
30575 1726867589.34832: running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr 30575 1726867589.35186: in run() - task 0affcac9-a3a5-e081-a588-000000000928 30575 1726867589.35294: variable 'ansible_search_path' from source: unknown 30575 1726867589.35298: variable 'ansible_search_path' from source: unknown 30575 1726867589.35301: calling self._execute() 30575 1726867589.35363: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867589.35369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867589.35381: variable 'omit' from source: magic vars 30575 1726867589.36318: variable 'ansible_distribution_major_version' from source: facts 30575 1726867589.36322: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867589.36324: variable 'omit' from source: magic vars 30575 1726867589.36363: variable 'omit' from source: magic vars 30575 1726867589.36573: variable 'interface' from source: play vars 30575 1726867589.36622: variable 'omit' from source: magic vars 30575 1726867589.36667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867589.36703: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867589.36723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867589.36743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867589.36767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867589.36999: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867589.37002: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867589.37005: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867589.37220: Set connection var ansible_pipelining to False 30575 1726867589.37223: Set connection var ansible_shell_type to sh 30575 1726867589.37232: Set connection var ansible_shell_executable to /bin/sh 30575 1726867589.37238: Set connection var ansible_timeout to 10 30575 1726867589.37293: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867589.37296: Set connection var ansible_connection to ssh 30575 1726867589.37298: variable 'ansible_shell_executable' from source: unknown 30575 1726867589.37435: variable 'ansible_connection' from source: unknown 30575 1726867589.37439: variable 'ansible_module_compression' from source: unknown 30575 1726867589.37441: variable 'ansible_shell_type' from source: unknown 30575 1726867589.37443: variable 'ansible_shell_executable' from source: unknown 30575 1726867589.37446: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867589.37448: variable 'ansible_pipelining' from source: unknown 30575 1726867589.37510: variable 'ansible_timeout' from source: unknown 30575 1726867589.37513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867589.37816: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867589.37824: variable 'omit' from source: magic vars 30575 1726867589.37835: starting attempt loop 30575 1726867589.37838: running the handler 30575 1726867589.37850: _low_level_execute_command(): starting 30575 1726867589.37857: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867589.39285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867589.39289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867589.39840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867589.40139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867589.40220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867589.42117: stdout chunk (state=3): >>>/root <<< 30575 1726867589.42121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867589.42123: stderr chunk (state=3): >>><<< 30575 1726867589.42126: stdout chunk (state=3): >>><<< 30575 1726867589.42153: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867589.42290: _low_level_execute_command(): starting 30575 1726867589.42307: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867589.422711-31705-162842563037920 `" && echo ansible-tmp-1726867589.422711-31705-162842563037920="` echo /root/.ansible/tmp/ansible-tmp-1726867589.422711-31705-162842563037920 `" ) && sleep 0' 30575 1726867589.43284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867589.43288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867589.43586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867589.43737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867589.43808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867589.45879: stdout chunk (state=3): >>>ansible-tmp-1726867589.422711-31705-162842563037920=/root/.ansible/tmp/ansible-tmp-1726867589.422711-31705-162842563037920 <<< 30575 1726867589.45883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867589.46002: stderr chunk (state=3): >>><<< 30575 1726867589.46006: stdout chunk (state=3): >>><<< 30575 1726867589.46008: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867589.422711-31705-162842563037920=/root/.ansible/tmp/ansible-tmp-1726867589.422711-31705-162842563037920 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867589.46011: variable 'ansible_module_compression' from source: unknown 30575 1726867589.46159: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30575 1726867589.46250: variable 'ansible_facts' from source: unknown 30575 1726867589.46476: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867589.422711-31705-162842563037920/AnsiballZ_stat.py 30575 1726867589.46900: Sending initial data 30575 1726867589.46903: Sent initial data (152 bytes) 30575 1726867589.47994: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867589.48113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867589.48127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867589.48139: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867589.48247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867589.48329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867589.49892: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867589.49967: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867589.50035: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp_g8xcfu9 /root/.ansible/tmp/ansible-tmp-1726867589.422711-31705-162842563037920/AnsiballZ_stat.py <<< 30575 1726867589.50039: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867589.422711-31705-162842563037920/AnsiballZ_stat.py" <<< 30575 1726867589.50243: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp_g8xcfu9" to remote "/root/.ansible/tmp/ansible-tmp-1726867589.422711-31705-162842563037920/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867589.422711-31705-162842563037920/AnsiballZ_stat.py" <<< 30575 1726867589.52243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867589.52403: stderr chunk (state=3): >>><<< 30575 1726867589.52407: stdout chunk (state=3): >>><<< 30575 1726867589.52411: done transferring module to remote 30575 1726867589.52413: _low_level_execute_command(): starting 30575 1726867589.52416: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867589.422711-31705-162842563037920/ /root/.ansible/tmp/ansible-tmp-1726867589.422711-31705-162842563037920/AnsiballZ_stat.py && sleep 0' 30575 1726867589.53193: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867589.53228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867589.53233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867589.53272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867589.53309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867589.55353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867589.55357: stdout chunk (state=3): >>><<< 30575 1726867589.55360: stderr chunk (state=3): >>><<< 30575 1726867589.55363: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867589.55370: _low_level_execute_command(): starting 30575 1726867589.55372: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867589.422711-31705-162842563037920/AnsiballZ_stat.py && sleep 0' 30575 1726867589.56385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867589.56508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867589.56512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867589.56535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867589.56569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867589.56796: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867589.56993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867589.57163: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867589.72239: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30575 1726867589.73423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867589.73433: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. <<< 30575 1726867589.73483: stderr chunk (state=3): >>><<< 30575 1726867589.73597: stdout chunk (state=3): >>><<< 30575 1726867589.73617: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867589.73652: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867589.422711-31705-162842563037920/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867589.73661: _low_level_execute_command(): starting 30575 1726867589.73666: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867589.422711-31705-162842563037920/ > /dev/null 2>&1 && sleep 0' 30575 1726867589.74985: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867589.74995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867589.75004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867589.75017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867589.75032: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867589.75038: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867589.75048: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867589.75166: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867589.75171: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867589.75250: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867589.75494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867589.75523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867589.77360: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867589.77403: stderr chunk (state=3): >>><<< 30575 1726867589.77416: stdout chunk (state=3): >>><<< 30575 1726867589.77437: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867589.77448: handler run complete 30575 1726867589.77472: attempt loop complete, returning result 30575 1726867589.77475: _execute() done 30575 1726867589.77479: dumping result to json 30575 1726867589.77482: done dumping result, returning 30575 1726867589.77614: done running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr [0affcac9-a3a5-e081-a588-000000000928] 30575 1726867589.77617: sending task result for task 0affcac9-a3a5-e081-a588-000000000928 30575 1726867589.77738: done sending task result for task 0affcac9-a3a5-e081-a588-000000000928 30575 1726867589.77742: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30575 1726867589.77807: no more pending results, returning what we have 30575 1726867589.77811: results queue empty 30575 1726867589.77812: checking for any_errors_fatal 30575 1726867589.77814: done checking for any_errors_fatal 30575 1726867589.77814: checking for max_fail_percentage 30575 1726867589.77816: done checking for max_fail_percentage 30575 1726867589.77817: checking to see if all hosts have failed and the running result is not ok 30575 1726867589.77818: done checking to see if all hosts have failed 30575 1726867589.77819: getting the remaining hosts for this loop 30575 1726867589.77821: done getting the remaining hosts for this loop 30575 1726867589.77828: getting the next task for host managed_node3 30575 1726867589.77838: done getting next 
task for host managed_node3 30575 1726867589.77841: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30575 1726867589.77845: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867589.77852: getting variables 30575 1726867589.77854: in VariableManager get_vars() 30575 1726867589.77892: Calling all_inventory to load vars for managed_node3 30575 1726867589.77894: Calling groups_inventory to load vars for managed_node3 30575 1726867589.77898: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867589.77909: Calling all_plugins_play to load vars for managed_node3 30575 1726867589.77913: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867589.77916: Calling groups_plugins_play to load vars for managed_node3 30575 1726867589.81122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867589.84892: done with get_vars() 30575 1726867589.84914: done getting variables 30575 1726867589.85100: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867589.85339: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 17:26:29 -0400 (0:00:00.518) 0:00:25.231 ****** 30575 1726867589.85370: entering _queue_task() for managed_node3/assert 30575 1726867589.86372: worker is 1 (out of 1 available) 30575 1726867589.86384: exiting _queue_task() for managed_node3/assert 30575 1726867589.86397: done queuing things up, now waiting for results queue to drain 30575 1726867589.86399: waiting for pending results... 
30575 1726867589.86994: running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' 30575 1726867589.87155: in run() - task 0affcac9-a3a5-e081-a588-0000000008a9 30575 1726867589.87159: variable 'ansible_search_path' from source: unknown 30575 1726867589.87161: variable 'ansible_search_path' from source: unknown 30575 1726867589.87164: calling self._execute() 30575 1726867589.87411: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867589.87481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867589.87485: variable 'omit' from source: magic vars 30575 1726867589.88205: variable 'ansible_distribution_major_version' from source: facts 30575 1726867589.88220: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867589.88229: variable 'omit' from source: magic vars 30575 1726867589.88453: variable 'omit' from source: magic vars 30575 1726867589.88784: variable 'interface' from source: play vars 30575 1726867589.88788: variable 'omit' from source: magic vars 30575 1726867589.88842: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867589.88880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867589.88964: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867589.88996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867589.89008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867589.89043: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867589.89268: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867589.89272: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867589.89393: Set connection var ansible_pipelining to False 30575 1726867589.89396: Set connection var ansible_shell_type to sh 30575 1726867589.89402: Set connection var ansible_shell_executable to /bin/sh 30575 1726867589.89408: Set connection var ansible_timeout to 10 30575 1726867589.89413: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867589.89421: Set connection var ansible_connection to ssh 30575 1726867589.89547: variable 'ansible_shell_executable' from source: unknown 30575 1726867589.89551: variable 'ansible_connection' from source: unknown 30575 1726867589.89553: variable 'ansible_module_compression' from source: unknown 30575 1726867589.89556: variable 'ansible_shell_type' from source: unknown 30575 1726867589.89558: variable 'ansible_shell_executable' from source: unknown 30575 1726867589.89560: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867589.89562: variable 'ansible_pipelining' from source: unknown 30575 1726867589.89564: variable 'ansible_timeout' from source: unknown 30575 1726867589.89566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867589.89919: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867589.89929: variable 'omit' from source: magic vars 30575 1726867589.89936: starting attempt loop 30575 1726867589.89938: running the handler 30575 1726867589.90283: variable 'interface_stat' from source: set_fact 30575 1726867589.90293: Evaluated conditional (not interface_stat.stat.exists): True 30575 1726867589.90410: handler run complete 30575 1726867589.90413: attempt loop complete, returning result 
30575 1726867589.90416: _execute() done 30575 1726867589.90418: dumping result to json 30575 1726867589.90421: done dumping result, returning 30575 1726867589.90423: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' [0affcac9-a3a5-e081-a588-0000000008a9] 30575 1726867589.90425: sending task result for task 0affcac9-a3a5-e081-a588-0000000008a9 30575 1726867589.90665: done sending task result for task 0affcac9-a3a5-e081-a588-0000000008a9 30575 1726867589.90667: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867589.90735: no more pending results, returning what we have 30575 1726867589.90739: results queue empty 30575 1726867589.90739: checking for any_errors_fatal 30575 1726867589.90750: done checking for any_errors_fatal 30575 1726867589.90751: checking for max_fail_percentage 30575 1726867589.90752: done checking for max_fail_percentage 30575 1726867589.90753: checking to see if all hosts have failed and the running result is not ok 30575 1726867589.90754: done checking to see if all hosts have failed 30575 1726867589.90755: getting the remaining hosts for this loop 30575 1726867589.90756: done getting the remaining hosts for this loop 30575 1726867589.90760: getting the next task for host managed_node3 30575 1726867589.90769: done getting next task for host managed_node3 30575 1726867589.90772: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30575 1726867589.90776: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867589.90785: getting variables 30575 1726867589.90786: in VariableManager get_vars() 30575 1726867589.90822: Calling all_inventory to load vars for managed_node3 30575 1726867589.90824: Calling groups_inventory to load vars for managed_node3 30575 1726867589.90828: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867589.91062: Calling all_plugins_play to load vars for managed_node3 30575 1726867589.91067: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867589.91070: Calling groups_plugins_play to load vars for managed_node3 30575 1726867589.95276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867589.99271: done with get_vars() 30575 1726867589.99299: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 17:26:29 -0400 (0:00:00.141) 0:00:25.372 ****** 30575 1726867589.99517: entering _queue_task() for managed_node3/include_tasks 30575 1726867590.00348: worker is 1 (out of 1 available) 30575 1726867590.00360: exiting _queue_task() for managed_node3/include_tasks 30575 1726867590.00373: done queuing things up, now waiting for results queue to drain 30575 1726867590.00374: waiting for pending results... 
30575 1726867590.01198: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 30575 1726867590.01207: in run() - task 0affcac9-a3a5-e081-a588-0000000008ad 30575 1726867590.01210: variable 'ansible_search_path' from source: unknown 30575 1726867590.01213: variable 'ansible_search_path' from source: unknown 30575 1726867590.01308: calling self._execute() 30575 1726867590.01533: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867590.01537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867590.01549: variable 'omit' from source: magic vars 30575 1726867590.02419: variable 'ansible_distribution_major_version' from source: facts 30575 1726867590.02475: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867590.02489: _execute() done 30575 1726867590.02520: dumping result to json 30575 1726867590.02747: done dumping result, returning 30575 1726867590.02751: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-e081-a588-0000000008ad] 30575 1726867590.02753: sending task result for task 0affcac9-a3a5-e081-a588-0000000008ad 30575 1726867590.03086: done sending task result for task 0affcac9-a3a5-e081-a588-0000000008ad 30575 1726867590.03090: WORKER PROCESS EXITING 30575 1726867590.03118: no more pending results, returning what we have 30575 1726867590.03126: in VariableManager get_vars() 30575 1726867590.03164: Calling all_inventory to load vars for managed_node3 30575 1726867590.03167: Calling groups_inventory to load vars for managed_node3 30575 1726867590.03171: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867590.03185: Calling all_plugins_play to load vars for managed_node3 30575 1726867590.03189: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867590.03192: Calling groups_plugins_play to load vars for managed_node3 30575 
1726867590.05505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867590.08729: done with get_vars() 30575 1726867590.08754: variable 'ansible_search_path' from source: unknown 30575 1726867590.08756: variable 'ansible_search_path' from source: unknown 30575 1726867590.08765: variable 'item' from source: include params 30575 1726867590.08871: variable 'item' from source: include params 30575 1726867590.08906: we have included files to process 30575 1726867590.08907: generating all_blocks data 30575 1726867590.08909: done generating all_blocks data 30575 1726867590.08913: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867590.08914: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867590.08916: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867590.10024: done processing included file 30575 1726867590.10026: iterating over new_blocks loaded from include file 30575 1726867590.10027: in VariableManager get_vars() 30575 1726867590.10110: done with get_vars() 30575 1726867590.10112: filtering new block on tags 30575 1726867590.10301: done filtering new block on tags 30575 1726867590.10305: in VariableManager get_vars() 30575 1726867590.10319: done with get_vars() 30575 1726867590.10321: filtering new block on tags 30575 1726867590.10494: done filtering new block on tags 30575 1726867590.10497: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 30575 1726867590.10502: extending task lists for all hosts with included blocks 30575 1726867590.11256: done 
extending task lists 30575 1726867590.11257: done processing included files 30575 1726867590.11258: results queue empty 30575 1726867590.11259: checking for any_errors_fatal 30575 1726867590.11262: done checking for any_errors_fatal 30575 1726867590.11262: checking for max_fail_percentage 30575 1726867590.11263: done checking for max_fail_percentage 30575 1726867590.11264: checking to see if all hosts have failed and the running result is not ok 30575 1726867590.11265: done checking to see if all hosts have failed 30575 1726867590.11266: getting the remaining hosts for this loop 30575 1726867590.11267: done getting the remaining hosts for this loop 30575 1726867590.11270: getting the next task for host managed_node3 30575 1726867590.11275: done getting next task for host managed_node3 30575 1726867590.11278: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30575 1726867590.11281: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30575 1726867590.11284: getting variables 30575 1726867590.11285: in VariableManager get_vars() 30575 1726867590.11293: Calling all_inventory to load vars for managed_node3 30575 1726867590.11295: Calling groups_inventory to load vars for managed_node3 30575 1726867590.11298: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867590.11304: Calling all_plugins_play to load vars for managed_node3 30575 1726867590.11307: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867590.11309: Calling groups_plugins_play to load vars for managed_node3 30575 1726867590.13639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867590.17997: done with get_vars() 30575 1726867590.18021: done getting variables 30575 1726867590.18064: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:26:30 -0400 (0:00:00.186) 0:00:25.559 ****** 30575 1726867590.18192: entering _queue_task() for managed_node3/set_fact 30575 1726867590.19046: worker is 1 (out of 1 available) 30575 1726867590.19060: exiting _queue_task() for managed_node3/set_fact 30575 1726867590.19283: done queuing things up, now waiting for results queue to drain 30575 1726867590.19285: waiting for pending results... 
30575 1726867590.19636: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 30575 1726867590.20084: in run() - task 0affcac9-a3a5-e081-a588-000000000946 30575 1726867590.20096: variable 'ansible_search_path' from source: unknown 30575 1726867590.20101: variable 'ansible_search_path' from source: unknown 30575 1726867590.20107: calling self._execute() 30575 1726867590.20237: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867590.20241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867590.20308: variable 'omit' from source: magic vars 30575 1726867590.21091: variable 'ansible_distribution_major_version' from source: facts 30575 1726867590.21094: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867590.21096: variable 'omit' from source: magic vars 30575 1726867590.21188: variable 'omit' from source: magic vars 30575 1726867590.21226: variable 'omit' from source: magic vars 30575 1726867590.21289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867590.21321: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867590.21405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867590.21408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867590.21512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867590.21540: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867590.21543: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867590.21545: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867590.21828: Set connection var ansible_pipelining to False 30575 1726867590.21832: Set connection var ansible_shell_type to sh 30575 1726867590.21834: Set connection var ansible_shell_executable to /bin/sh 30575 1726867590.21849: Set connection var ansible_timeout to 10 30575 1726867590.21853: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867590.21905: Set connection var ansible_connection to ssh 30575 1726867590.21971: variable 'ansible_shell_executable' from source: unknown 30575 1726867590.21975: variable 'ansible_connection' from source: unknown 30575 1726867590.21981: variable 'ansible_module_compression' from source: unknown 30575 1726867590.21983: variable 'ansible_shell_type' from source: unknown 30575 1726867590.21986: variable 'ansible_shell_executable' from source: unknown 30575 1726867590.21988: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867590.21990: variable 'ansible_pipelining' from source: unknown 30575 1726867590.21992: variable 'ansible_timeout' from source: unknown 30575 1726867590.22128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867590.22154: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867590.22164: variable 'omit' from source: magic vars 30575 1726867590.22170: starting attempt loop 30575 1726867590.22173: running the handler 30575 1726867590.22189: handler run complete 30575 1726867590.22199: attempt loop complete, returning result 30575 1726867590.22201: _execute() done 30575 1726867590.22204: dumping result to json 30575 1726867590.22206: done dumping result, returning 30575 1726867590.22214: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-e081-a588-000000000946] 30575 1726867590.22219: sending task result for task 0affcac9-a3a5-e081-a588-000000000946 30575 1726867590.22540: done sending task result for task 0affcac9-a3a5-e081-a588-000000000946 30575 1726867590.22543: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30575 1726867590.22592: no more pending results, returning what we have 30575 1726867590.22595: results queue empty 30575 1726867590.22596: checking for any_errors_fatal 30575 1726867590.22598: done checking for any_errors_fatal 30575 1726867590.22599: checking for max_fail_percentage 30575 1726867590.22600: done checking for max_fail_percentage 30575 1726867590.22601: checking to see if all hosts have failed and the running result is not ok 30575 1726867590.22602: done checking to see if all hosts have failed 30575 1726867590.22602: getting the remaining hosts for this loop 30575 1726867590.22603: done getting the remaining hosts for this loop 30575 1726867590.22607: getting the next task for host managed_node3 30575 1726867590.22614: done getting next task for host managed_node3 30575 1726867590.22616: ^ task is: TASK: Stat profile file 30575 1726867590.22621: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867590.22623: getting variables 30575 1726867590.22625: in VariableManager get_vars() 30575 1726867590.22650: Calling all_inventory to load vars for managed_node3 30575 1726867590.22653: Calling groups_inventory to load vars for managed_node3 30575 1726867590.22656: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867590.22665: Calling all_plugins_play to load vars for managed_node3 30575 1726867590.22667: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867590.22670: Calling groups_plugins_play to load vars for managed_node3 30575 1726867590.24659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867590.27645: done with get_vars() 30575 1726867590.27771: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:26:30 -0400 (0:00:00.099) 0:00:25.658 ****** 30575 1726867590.28138: entering _queue_task() for managed_node3/stat 30575 1726867590.28808: worker is 1 (out of 1 available) 30575 1726867590.28822: exiting _queue_task() for managed_node3/stat 30575 1726867590.28835: done queuing things up, now waiting for results queue to drain 30575 1726867590.28836: 
waiting for pending results... 30575 1726867590.29598: running TaskExecutor() for managed_node3/TASK: Stat profile file 30575 1726867590.29935: in run() - task 0affcac9-a3a5-e081-a588-000000000947 30575 1726867590.30028: variable 'ansible_search_path' from source: unknown 30575 1726867590.30031: variable 'ansible_search_path' from source: unknown 30575 1726867590.30131: calling self._execute() 30575 1726867590.30433: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867590.30439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867590.30452: variable 'omit' from source: magic vars 30575 1726867590.31374: variable 'ansible_distribution_major_version' from source: facts 30575 1726867590.31508: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867590.31541: variable 'omit' from source: magic vars 30575 1726867590.31886: variable 'omit' from source: magic vars 30575 1726867590.31890: variable 'profile' from source: play vars 30575 1726867590.31892: variable 'interface' from source: play vars 30575 1726867590.32008: variable 'interface' from source: play vars 30575 1726867590.32029: variable 'omit' from source: magic vars 30575 1726867590.32382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867590.32386: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867590.32389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867590.32391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867590.32393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867590.32395: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30575 1726867590.32397: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867590.32402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867590.32404: Set connection var ansible_pipelining to False 30575 1726867590.32431: Set connection var ansible_shell_type to sh 30575 1726867590.32439: Set connection var ansible_shell_executable to /bin/sh 30575 1726867590.32445: Set connection var ansible_timeout to 10 30575 1726867590.32450: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867590.32458: Set connection var ansible_connection to ssh 30575 1726867590.32507: variable 'ansible_shell_executable' from source: unknown 30575 1726867590.32510: variable 'ansible_connection' from source: unknown 30575 1726867590.32516: variable 'ansible_module_compression' from source: unknown 30575 1726867590.32518: variable 'ansible_shell_type' from source: unknown 30575 1726867590.32521: variable 'ansible_shell_executable' from source: unknown 30575 1726867590.32525: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867590.32528: variable 'ansible_pipelining' from source: unknown 30575 1726867590.32530: variable 'ansible_timeout' from source: unknown 30575 1726867590.32532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867590.32788: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867590.32797: variable 'omit' from source: magic vars 30575 1726867590.32815: starting attempt loop 30575 1726867590.32818: running the handler 30575 1726867590.32831: _low_level_execute_command(): starting 30575 1726867590.32839: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 
1726867590.33811: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867590.33820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867590.33830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867590.33928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867590.35641: stdout chunk (state=3): >>>/root <<< 30575 1726867590.35894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867590.35899: stdout chunk (state=3): >>><<< 30575 1726867590.35913: stderr chunk (state=3): >>><<< 30575 1726867590.35955: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867590.35973: _low_level_execute_command(): starting 30575 1726867590.36288: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867590.3595507-31743-149000786608366 `" && echo ansible-tmp-1726867590.3595507-31743-149000786608366="` echo /root/.ansible/tmp/ansible-tmp-1726867590.3595507-31743-149000786608366 `" ) && sleep 0' 30575 1726867590.38386: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867590.38621: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867590.38892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867590.38895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867590.40736: stdout chunk (state=3): >>>ansible-tmp-1726867590.3595507-31743-149000786608366=/root/.ansible/tmp/ansible-tmp-1726867590.3595507-31743-149000786608366 <<< 30575 1726867590.40845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867590.40936: stderr chunk (state=3): >>><<< 30575 1726867590.40939: stdout chunk (state=3): >>><<< 30575 1726867590.41086: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867590.3595507-31743-149000786608366=/root/.ansible/tmp/ansible-tmp-1726867590.3595507-31743-149000786608366 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867590.41089: variable 'ansible_module_compression' from source: unknown 30575 1726867590.41130: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30575 1726867590.41174: variable 'ansible_facts' from source: unknown 30575 1726867590.42184: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867590.3595507-31743-149000786608366/AnsiballZ_stat.py 30575 1726867590.42542: Sending initial data 30575 1726867590.42545: Sent initial data (153 bytes) 30575 1726867590.44075: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867590.44155: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867590.44194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867590.44283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867590.44438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867590.45924: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867590.45958: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867590.46019: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp50_5gcu6 /root/.ansible/tmp/ansible-tmp-1726867590.3595507-31743-149000786608366/AnsiballZ_stat.py <<< 30575 1726867590.46041: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867590.3595507-31743-149000786608366/AnsiballZ_stat.py" <<< 30575 1726867590.46082: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp50_5gcu6" to remote "/root/.ansible/tmp/ansible-tmp-1726867590.3595507-31743-149000786608366/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867590.3595507-31743-149000786608366/AnsiballZ_stat.py" <<< 30575 1726867590.46906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867590.46909: stdout chunk (state=3): >>><<< 30575 1726867590.46916: stderr chunk (state=3): >>><<< 30575 1726867590.46957: done transferring module to remote 30575 1726867590.46967: _low_level_execute_command(): starting 30575 1726867590.46972: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867590.3595507-31743-149000786608366/ /root/.ansible/tmp/ansible-tmp-1726867590.3595507-31743-149000786608366/AnsiballZ_stat.py && sleep 0' 30575 1726867590.48018: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867590.48023: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867590.48046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867590.48159: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867590.48193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867590.48281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867590.50075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867590.50081: stdout chunk (state=3): >>><<< 30575 1726867590.50087: stderr chunk (state=3): >>><<< 30575 1726867590.50121: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867590.50128: _low_level_execute_command(): starting 30575 1726867590.50131: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867590.3595507-31743-149000786608366/AnsiballZ_stat.py && sleep 0' 30575 1726867590.50789: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867590.50808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867590.50833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867590.50854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867590.50873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867590.50898: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867590.50965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867590.51027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' <<< 30575 1726867590.51049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867590.51111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867590.51185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867590.66317: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30575 1726867590.67753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867590.67781: stderr chunk (state=3): >>><<< 30575 1726867590.67806: stdout chunk (state=3): >>><<< 30575 1726867590.67845: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867590.67911: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867590.3595507-31743-149000786608366/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867590.67920: _low_level_execute_command(): starting 30575 1726867590.67983: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867590.3595507-31743-149000786608366/ > /dev/null 2>&1 && sleep 0' 30575 1726867590.68389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867590.68403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867590.68413: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867590.68464: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867590.68520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867590.70385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867590.70402: stdout chunk (state=3): >>><<< 30575 1726867590.70415: stderr chunk (state=3): >>><<< 30575 1726867590.70447: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867590.70582: handler run complete 30575 1726867590.70590: attempt loop complete, returning result 30575 1726867590.70592: _execute() done 30575 1726867590.70597: dumping result to json 30575 1726867590.70600: done dumping result, returning 30575 1726867590.70603: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcac9-a3a5-e081-a588-000000000947] 30575 1726867590.70606: sending task result for task 0affcac9-a3a5-e081-a588-000000000947 30575 1726867590.70682: done sending task result for task 0affcac9-a3a5-e081-a588-000000000947 ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30575 1726867590.70739: no more pending results, returning what we have 30575 1726867590.70743: results queue empty 30575 1726867590.70743: checking for any_errors_fatal 30575 1726867590.70749: done checking for any_errors_fatal 30575 1726867590.70750: checking for max_fail_percentage 30575 1726867590.70751: done checking for max_fail_percentage 30575 1726867590.70752: checking to see if all hosts have failed and the running result is not ok 30575 1726867590.70753: done checking to see if all hosts have failed 30575 1726867590.70754: getting the remaining hosts for this loop 30575 1726867590.70755: done getting the remaining hosts for this loop 30575 1726867590.70759: getting the next task for host managed_node3 30575 1726867590.70770: done getting next task for host managed_node3 30575 1726867590.70772: ^ task is: TASK: Set NM profile exist flag based on the profile files 30575 1726867590.70779: ^ state is: HOST STATE: block=4, task=2, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867590.70784: getting variables 30575 1726867590.70785: in VariableManager get_vars() 30575 1726867590.70826: Calling all_inventory to load vars for managed_node3 30575 1726867590.70829: Calling groups_inventory to load vars for managed_node3 30575 1726867590.70833: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867590.70844: Calling all_plugins_play to load vars for managed_node3 30575 1726867590.70847: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867590.70850: Calling groups_plugins_play to load vars for managed_node3 30575 1726867590.71600: WORKER PROCESS EXITING 30575 1726867590.72207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867590.73184: done with get_vars() 30575 1726867590.73199: done getting variables 30575 1726867590.73245: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:26:30 -0400 (0:00:00.451) 0:00:26.110 ****** 30575 1726867590.73271: entering _queue_task() for managed_node3/set_fact 30575 1726867590.73523: worker is 1 (out of 1 available) 30575 1726867590.73537: exiting _queue_task() for managed_node3/set_fact 30575 1726867590.73550: done queuing things up, now waiting for results queue to drain 30575 1726867590.73552: waiting for pending results... 
30575 1726867590.73740: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 30575 1726867590.73823: in run() - task 0affcac9-a3a5-e081-a588-000000000948 30575 1726867590.73838: variable 'ansible_search_path' from source: unknown 30575 1726867590.73841: variable 'ansible_search_path' from source: unknown 30575 1726867590.73870: calling self._execute() 30575 1726867590.73944: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867590.73948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867590.73956: variable 'omit' from source: magic vars 30575 1726867590.74240: variable 'ansible_distribution_major_version' from source: facts 30575 1726867590.74249: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867590.74340: variable 'profile_stat' from source: set_fact 30575 1726867590.74344: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867590.74347: when evaluation is False, skipping this task 30575 1726867590.74349: _execute() done 30575 1726867590.74353: dumping result to json 30575 1726867590.74358: done dumping result, returning 30575 1726867590.74364: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-e081-a588-000000000948] 30575 1726867590.74369: sending task result for task 0affcac9-a3a5-e081-a588-000000000948 30575 1726867590.74459: done sending task result for task 0affcac9-a3a5-e081-a588-000000000948 30575 1726867590.74462: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867590.74509: no more pending results, returning what we have 30575 1726867590.74513: results queue empty 30575 1726867590.74514: checking for any_errors_fatal 30575 1726867590.74525: done checking for any_errors_fatal 30575 1726867590.74526: 
checking for max_fail_percentage 30575 1726867590.74528: done checking for max_fail_percentage 30575 1726867590.74529: checking to see if all hosts have failed and the running result is not ok 30575 1726867590.74530: done checking to see if all hosts have failed 30575 1726867590.74530: getting the remaining hosts for this loop 30575 1726867590.74532: done getting the remaining hosts for this loop 30575 1726867590.74536: getting the next task for host managed_node3 30575 1726867590.74543: done getting next task for host managed_node3 30575 1726867590.74546: ^ task is: TASK: Get NM profile info 30575 1726867590.74550: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867590.74554: getting variables 30575 1726867590.74556: in VariableManager get_vars() 30575 1726867590.74594: Calling all_inventory to load vars for managed_node3 30575 1726867590.74596: Calling groups_inventory to load vars for managed_node3 30575 1726867590.74599: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867590.74609: Calling all_plugins_play to load vars for managed_node3 30575 1726867590.74611: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867590.74613: Calling groups_plugins_play to load vars for managed_node3 30575 1726867590.75554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867590.76448: done with get_vars() 30575 1726867590.76462: done getting variables 30575 1726867590.76506: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:26:30 -0400 (0:00:00.032) 0:00:26.142 ****** 30575 1726867590.76529: entering _queue_task() for managed_node3/shell 30575 1726867590.76764: worker is 1 (out of 1 available) 30575 1726867590.76776: exiting _queue_task() for managed_node3/shell 30575 1726867590.76790: done queuing things up, now waiting for results queue to drain 30575 1726867590.76792: waiting for pending results... 
30575 1726867590.77092: running TaskExecutor() for managed_node3/TASK: Get NM profile info 30575 1726867590.77155: in run() - task 0affcac9-a3a5-e081-a588-000000000949 30575 1726867590.77175: variable 'ansible_search_path' from source: unknown 30575 1726867590.77284: variable 'ansible_search_path' from source: unknown 30575 1726867590.77288: calling self._execute() 30575 1726867590.77329: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867590.77342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867590.77358: variable 'omit' from source: magic vars 30575 1726867590.77728: variable 'ansible_distribution_major_version' from source: facts 30575 1726867590.77745: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867590.77757: variable 'omit' from source: magic vars 30575 1726867590.77821: variable 'omit' from source: magic vars 30575 1726867590.78157: variable 'profile' from source: play vars 30575 1726867590.78161: variable 'interface' from source: play vars 30575 1726867590.78190: variable 'interface' from source: play vars 30575 1726867590.78214: variable 'omit' from source: magic vars 30575 1726867590.78257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867590.78312: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867590.78338: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867590.78358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867590.78372: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867590.78403: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 
1726867590.78411: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867590.78418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867590.78509: Set connection var ansible_pipelining to False 30575 1726867590.78682: Set connection var ansible_shell_type to sh 30575 1726867590.78685: Set connection var ansible_shell_executable to /bin/sh 30575 1726867590.78687: Set connection var ansible_timeout to 10 30575 1726867590.78688: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867590.78690: Set connection var ansible_connection to ssh 30575 1726867590.78692: variable 'ansible_shell_executable' from source: unknown 30575 1726867590.78694: variable 'ansible_connection' from source: unknown 30575 1726867590.78696: variable 'ansible_module_compression' from source: unknown 30575 1726867590.78698: variable 'ansible_shell_type' from source: unknown 30575 1726867590.78700: variable 'ansible_shell_executable' from source: unknown 30575 1726867590.78701: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867590.78703: variable 'ansible_pipelining' from source: unknown 30575 1726867590.78705: variable 'ansible_timeout' from source: unknown 30575 1726867590.78707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867590.78747: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867590.78760: variable 'omit' from source: magic vars 30575 1726867590.78768: starting attempt loop 30575 1726867590.78773: running the handler 30575 1726867590.78786: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867590.78806: _low_level_execute_command(): starting 30575 1726867590.78815: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867590.79501: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867590.79561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867590.79576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867590.79601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867590.79671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867590.81269: stdout chunk (state=3): >>>/root <<< 30575 1726867590.81390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867590.81394: stdout chunk (state=3): >>><<< 30575 1726867590.81401: stderr chunk (state=3): 
>>><<< 30575 1726867590.81415: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867590.81429: _low_level_execute_command(): starting 30575 1726867590.81435: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867590.8141599-31789-34760307619004 `" && echo ansible-tmp-1726867590.8141599-31789-34760307619004="` echo /root/.ansible/tmp/ansible-tmp-1726867590.8141599-31789-34760307619004 `" ) && sleep 0' 30575 1726867590.81836: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867590.81845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867590.81849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867590.81852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867590.81905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867590.81908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867590.81947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867590.83831: stdout chunk (state=3): >>>ansible-tmp-1726867590.8141599-31789-34760307619004=/root/.ansible/tmp/ansible-tmp-1726867590.8141599-31789-34760307619004 <<< 30575 1726867590.83937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867590.83962: stderr chunk (state=3): >>><<< 30575 1726867590.83966: stdout chunk (state=3): >>><<< 30575 1726867590.83982: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867590.8141599-31789-34760307619004=/root/.ansible/tmp/ansible-tmp-1726867590.8141599-31789-34760307619004 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867590.84005: variable 'ansible_module_compression' from source: unknown 30575 1726867590.84044: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867590.84078: variable 'ansible_facts' from source: unknown 30575 1726867590.84128: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867590.8141599-31789-34760307619004/AnsiballZ_command.py 30575 1726867590.84217: Sending initial data 30575 1726867590.84221: Sent initial data (155 bytes) 30575 1726867590.84637: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867590.84641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867590.84643: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867590.84645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867590.84688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867590.84692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867590.84744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867590.86267: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30575 1726867590.86274: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867590.86308: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 30575 1726867590.86359: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp_6bezgrd /root/.ansible/tmp/ansible-tmp-1726867590.8141599-31789-34760307619004/AnsiballZ_command.py <<< 30575 1726867590.86362: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867590.8141599-31789-34760307619004/AnsiballZ_command.py" <<< 30575 1726867590.86398: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp_6bezgrd" to remote "/root/.ansible/tmp/ansible-tmp-1726867590.8141599-31789-34760307619004/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867590.8141599-31789-34760307619004/AnsiballZ_command.py" <<< 30575 1726867590.86928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867590.86962: stderr chunk (state=3): >>><<< 30575 1726867590.86965: stdout chunk (state=3): >>><<< 30575 1726867590.87009: done transferring module to remote 30575 1726867590.87014: _low_level_execute_command(): starting 30575 1726867590.87019: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867590.8141599-31789-34760307619004/ /root/.ansible/tmp/ansible-tmp-1726867590.8141599-31789-34760307619004/AnsiballZ_command.py && sleep 0' 30575 1726867590.87417: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867590.87420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867590.87425: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867590.87433: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867590.87466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867590.87488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867590.87528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867590.89242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867590.89267: stderr chunk (state=3): >>><<< 30575 1726867590.89271: stdout chunk (state=3): >>><<< 30575 1726867590.89281: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867590.89284: _low_level_execute_command(): starting 30575 1726867590.89289: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867590.8141599-31789-34760307619004/AnsiballZ_command.py && sleep 0' 30575 1726867590.89657: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867590.89688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867590.89691: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867590.89695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867590.89697: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867590.89699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 
1726867590.89747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867590.89750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867590.89805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867591.06693: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 17:26:31.048021", "end": "2024-09-20 17:26:31.064532", "delta": "0:00:00.016511", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867591.08293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867591.08326: stderr chunk (state=3): >>><<< 30575 1726867591.08330: stdout chunk (state=3): >>><<< 30575 1726867591.08347: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 17:26:31.048021", "end": "2024-09-20 17:26:31.064532", "delta": "0:00:00.016511", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.15.68 closed. 30575 1726867591.08473: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867590.8141599-31789-34760307619004/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867591.08479: _low_level_execute_command(): starting 30575 1726867591.08482: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867590.8141599-31789-34760307619004/ > /dev/null 2>&1 && sleep 0' 30575 1726867591.09005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867591.09020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867591.09040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867591.09102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867591.09176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867591.09198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867591.09221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867591.09307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867591.11483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867591.11486: stdout chunk (state=3): >>><<< 30575 1726867591.11489: stderr chunk (state=3): >>><<< 30575 1726867591.11491: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867591.11494: handler run complete 30575 1726867591.11497: Evaluated conditional (False): False 30575 1726867591.11500: attempt loop complete, returning result 30575 1726867591.11502: _execute() done 30575 1726867591.11505: dumping result to json 30575 1726867591.11507: done dumping result, returning 30575 1726867591.11510: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcac9-a3a5-e081-a588-000000000949] 30575 1726867591.11513: sending task result for task 0affcac9-a3a5-e081-a588-000000000949 30575 1726867591.11586: done sending task result for task 0affcac9-a3a5-e081-a588-000000000949 ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.016511", "end": "2024-09-20 17:26:31.064532", "rc": 0, "start": "2024-09-20 17:26:31.048021" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 30575 1726867591.11666: no more pending results, returning what we have 30575 1726867591.11671: results queue empty 30575 1726867591.11672: checking for any_errors_fatal 30575 1726867591.11683: done checking for any_errors_fatal 30575 1726867591.11683: checking for max_fail_percentage 30575 1726867591.11685: done checking for max_fail_percentage 30575 1726867591.11686: checking to see if all hosts have failed and the running result is not ok 30575 1726867591.11687: done checking to see if all hosts have failed 30575 1726867591.11688: getting the remaining hosts for this loop 30575 1726867591.11689: done getting the remaining hosts for this loop 30575 1726867591.11693: getting the next task for host managed_node3 30575 1726867591.11702: done getting next task for host managed_node3 30575 1726867591.11704: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag 
true based on the nmcli output 30575 1726867591.11709: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867591.11712: getting variables 30575 1726867591.11714: in VariableManager get_vars() 30575 1726867591.11749: Calling all_inventory to load vars for managed_node3 30575 1726867591.11751: Calling groups_inventory to load vars for managed_node3 30575 1726867591.11755: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867591.11765: Calling all_plugins_play to load vars for managed_node3 30575 1726867591.11768: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867591.11770: Calling groups_plugins_play to load vars for managed_node3 30575 1726867591.12298: WORKER PROCESS EXITING 30575 1726867591.15206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867591.17955: done with get_vars() 30575 1726867591.17988: done getting variables 30575 1726867591.18052: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:26:31 -0400 (0:00:00.415) 0:00:26.558 ****** 30575 1726867591.18094: entering _queue_task() for managed_node3/set_fact 30575 1726867591.18467: worker is 1 (out of 1 available) 30575 1726867591.18621: exiting _queue_task() for managed_node3/set_fact 30575 1726867591.18635: done queuing things up, now waiting for results queue to drain 30575 1726867591.18637: waiting for pending results... 
30575 1726867591.18818: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30575 1726867591.18941: in run() - task 0affcac9-a3a5-e081-a588-00000000094a 30575 1726867591.18959: variable 'ansible_search_path' from source: unknown 30575 1726867591.18963: variable 'ansible_search_path' from source: unknown 30575 1726867591.19005: calling self._execute() 30575 1726867591.19104: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867591.19107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867591.19119: variable 'omit' from source: magic vars 30575 1726867591.19590: variable 'ansible_distribution_major_version' from source: facts 30575 1726867591.19606: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867591.19855: variable 'nm_profile_exists' from source: set_fact 30575 1726867591.19866: Evaluated conditional (nm_profile_exists.rc == 0): True 30575 1726867591.19872: variable 'omit' from source: magic vars 30575 1726867591.19975: variable 'omit' from source: magic vars 30575 1726867591.20068: variable 'omit' from source: magic vars 30575 1726867591.20179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867591.20241: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867591.20329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867591.20347: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867591.20360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867591.20502: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
30575 1726867591.20505: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867591.20508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867591.20678: Set connection var ansible_pipelining to False 30575 1726867591.20681: Set connection var ansible_shell_type to sh 30575 1726867591.20797: Set connection var ansible_shell_executable to /bin/sh 30575 1726867591.20804: Set connection var ansible_timeout to 10 30575 1726867591.20809: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867591.20817: Set connection var ansible_connection to ssh 30575 1726867591.20845: variable 'ansible_shell_executable' from source: unknown 30575 1726867591.20848: variable 'ansible_connection' from source: unknown 30575 1726867591.20851: variable 'ansible_module_compression' from source: unknown 30575 1726867591.20853: variable 'ansible_shell_type' from source: unknown 30575 1726867591.20855: variable 'ansible_shell_executable' from source: unknown 30575 1726867591.20857: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867591.20862: variable 'ansible_pipelining' from source: unknown 30575 1726867591.20865: variable 'ansible_timeout' from source: unknown 30575 1726867591.20868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867591.21174: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867591.21179: variable 'omit' from source: magic vars 30575 1726867591.21182: starting attempt loop 30575 1726867591.21184: running the handler 30575 1726867591.21185: handler run complete 30575 1726867591.21187: attempt loop complete, returning result 30575 1726867591.21189: _execute() done 
30575 1726867591.21190: dumping result to json 30575 1726867591.21192: done dumping result, returning 30575 1726867591.21194: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-e081-a588-00000000094a] 30575 1726867591.21196: sending task result for task 0affcac9-a3a5-e081-a588-00000000094a 30575 1726867591.21254: done sending task result for task 0affcac9-a3a5-e081-a588-00000000094a 30575 1726867591.21257: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 30575 1726867591.21315: no more pending results, returning what we have 30575 1726867591.21319: results queue empty 30575 1726867591.21320: checking for any_errors_fatal 30575 1726867591.21331: done checking for any_errors_fatal 30575 1726867591.21336: checking for max_fail_percentage 30575 1726867591.21338: done checking for max_fail_percentage 30575 1726867591.21339: checking to see if all hosts have failed and the running result is not ok 30575 1726867591.21341: done checking to see if all hosts have failed 30575 1726867591.21341: getting the remaining hosts for this loop 30575 1726867591.21343: done getting the remaining hosts for this loop 30575 1726867591.21347: getting the next task for host managed_node3 30575 1726867591.21359: done getting next task for host managed_node3 30575 1726867591.21361: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30575 1726867591.21366: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867591.21371: getting variables 30575 1726867591.21373: in VariableManager get_vars() 30575 1726867591.21408: Calling all_inventory to load vars for managed_node3 30575 1726867591.21410: Calling groups_inventory to load vars for managed_node3 30575 1726867591.21414: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867591.21429: Calling all_plugins_play to load vars for managed_node3 30575 1726867591.21432: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867591.21435: Calling groups_plugins_play to load vars for managed_node3 30575 1726867591.23135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867591.24894: done with get_vars() 30575 1726867591.24914: done getting variables 30575 1726867591.24983: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867591.25111: variable 'profile' from source: play vars 30575 
1726867591.25115: variable 'interface' from source: play vars 30575 1726867591.25182: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:26:31 -0400 (0:00:00.071) 0:00:26.629 ****** 30575 1726867591.25213: entering _queue_task() for managed_node3/command 30575 1726867591.25567: worker is 1 (out of 1 available) 30575 1726867591.25788: exiting _queue_task() for managed_node3/command 30575 1726867591.25799: done queuing things up, now waiting for results queue to drain 30575 1726867591.25800: waiting for pending results... 30575 1726867591.25954: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr 30575 1726867591.26118: in run() - task 0affcac9-a3a5-e081-a588-00000000094c 30575 1726867591.26122: variable 'ansible_search_path' from source: unknown 30575 1726867591.26125: variable 'ansible_search_path' from source: unknown 30575 1726867591.26134: calling self._execute() 30575 1726867591.26231: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867591.26282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867591.26287: variable 'omit' from source: magic vars 30575 1726867591.26643: variable 'ansible_distribution_major_version' from source: facts 30575 1726867591.26658: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867591.26787: variable 'profile_stat' from source: set_fact 30575 1726867591.26798: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867591.26801: when evaluation is False, skipping this task 30575 1726867591.26804: _execute() done 30575 1726867591.26806: dumping result to json 30575 1726867591.26809: done dumping result, returning 30575 1726867591.26876: done running 
TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-00000000094c] 30575 1726867591.26881: sending task result for task 0affcac9-a3a5-e081-a588-00000000094c 30575 1726867591.26938: done sending task result for task 0affcac9-a3a5-e081-a588-00000000094c 30575 1726867591.26942: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867591.27001: no more pending results, returning what we have 30575 1726867591.27006: results queue empty 30575 1726867591.27006: checking for any_errors_fatal 30575 1726867591.27013: done checking for any_errors_fatal 30575 1726867591.27014: checking for max_fail_percentage 30575 1726867591.27016: done checking for max_fail_percentage 30575 1726867591.27017: checking to see if all hosts have failed and the running result is not ok 30575 1726867591.27018: done checking to see if all hosts have failed 30575 1726867591.27019: getting the remaining hosts for this loop 30575 1726867591.27021: done getting the remaining hosts for this loop 30575 1726867591.27028: getting the next task for host managed_node3 30575 1726867591.27037: done getting next task for host managed_node3 30575 1726867591.27040: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30575 1726867591.27045: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867591.27049: getting variables 30575 1726867591.27051: in VariableManager get_vars() 30575 1726867591.27199: Calling all_inventory to load vars for managed_node3 30575 1726867591.27202: Calling groups_inventory to load vars for managed_node3 30575 1726867591.27206: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867591.27217: Calling all_plugins_play to load vars for managed_node3 30575 1726867591.27220: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867591.27222: Calling groups_plugins_play to load vars for managed_node3 30575 1726867591.30141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867591.31574: done with get_vars() 30575 1726867591.31592: done getting variables 30575 1726867591.31635: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867591.31715: variable 'profile' from source: play vars 30575 1726867591.31719: variable 'interface' from source: play vars 30575 1726867591.31766: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] 
********************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:26:31 -0400 (0:00:00.065) 0:00:26.695 ****** 30575 1726867591.31796: entering _queue_task() for managed_node3/set_fact 30575 1726867591.32092: worker is 1 (out of 1 available) 30575 1726867591.32106: exiting _queue_task() for managed_node3/set_fact 30575 1726867591.32118: done queuing things up, now waiting for results queue to drain 30575 1726867591.32120: waiting for pending results... 30575 1726867591.32408: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr 30575 1726867591.32504: in run() - task 0affcac9-a3a5-e081-a588-00000000094d 30575 1726867591.32508: variable 'ansible_search_path' from source: unknown 30575 1726867591.32511: variable 'ansible_search_path' from source: unknown 30575 1726867591.32514: calling self._execute() 30575 1726867591.32603: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867591.32608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867591.32617: variable 'omit' from source: magic vars 30575 1726867591.32960: variable 'ansible_distribution_major_version' from source: facts 30575 1726867591.32971: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867591.33109: variable 'profile_stat' from source: set_fact 30575 1726867591.33113: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867591.33118: when evaluation is False, skipping this task 30575 1726867591.33178: _execute() done 30575 1726867591.33182: dumping result to json 30575 1726867591.33187: done dumping result, returning 30575 1726867591.33189: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-00000000094d] 30575 1726867591.33191: sending task result for task 
0affcac9-a3a5-e081-a588-00000000094d 30575 1726867591.33254: done sending task result for task 0affcac9-a3a5-e081-a588-00000000094d 30575 1726867591.33257: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867591.33315: no more pending results, returning what we have 30575 1726867591.33320: results queue empty 30575 1726867591.33321: checking for any_errors_fatal 30575 1726867591.33330: done checking for any_errors_fatal 30575 1726867591.33331: checking for max_fail_percentage 30575 1726867591.33333: done checking for max_fail_percentage 30575 1726867591.33334: checking to see if all hosts have failed and the running result is not ok 30575 1726867591.33335: done checking to see if all hosts have failed 30575 1726867591.33336: getting the remaining hosts for this loop 30575 1726867591.33338: done getting the remaining hosts for this loop 30575 1726867591.33343: getting the next task for host managed_node3 30575 1726867591.33353: done getting next task for host managed_node3 30575 1726867591.33355: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30575 1726867591.33360: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867591.33364: getting variables 30575 1726867591.33580: in VariableManager get_vars() 30575 1726867591.33610: Calling all_inventory to load vars for managed_node3 30575 1726867591.33613: Calling groups_inventory to load vars for managed_node3 30575 1726867591.33616: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867591.33626: Calling all_plugins_play to load vars for managed_node3 30575 1726867591.33629: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867591.33632: Calling groups_plugins_play to load vars for managed_node3 30575 1726867591.34994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867591.35860: done with get_vars() 30575 1726867591.35874: done getting variables 30575 1726867591.35916: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867591.35993: variable 'profile' from source: play vars 30575 1726867591.35996: variable 'interface' from source: play vars 30575 1726867591.36069: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:26:31 -0400 (0:00:00.043) 
0:00:26.738 ****** 30575 1726867591.36113: entering _queue_task() for managed_node3/command 30575 1726867591.36397: worker is 1 (out of 1 available) 30575 1726867591.36411: exiting _queue_task() for managed_node3/command 30575 1726867591.36424: done queuing things up, now waiting for results queue to drain 30575 1726867591.36426: waiting for pending results... 30575 1726867591.36802: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr 30575 1726867591.36809: in run() - task 0affcac9-a3a5-e081-a588-00000000094e 30575 1726867591.36812: variable 'ansible_search_path' from source: unknown 30575 1726867591.36815: variable 'ansible_search_path' from source: unknown 30575 1726867591.36818: calling self._execute() 30575 1726867591.36935: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867591.36941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867591.36950: variable 'omit' from source: magic vars 30575 1726867591.37204: variable 'ansible_distribution_major_version' from source: facts 30575 1726867591.37212: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867591.37298: variable 'profile_stat' from source: set_fact 30575 1726867591.37306: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867591.37309: when evaluation is False, skipping this task 30575 1726867591.37312: _execute() done 30575 1726867591.37316: dumping result to json 30575 1726867591.37319: done dumping result, returning 30575 1726867591.37330: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-00000000094e] 30575 1726867591.37333: sending task result for task 0affcac9-a3a5-e081-a588-00000000094e 30575 1726867591.37415: done sending task result for task 0affcac9-a3a5-e081-a588-00000000094e 30575 1726867591.37417: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, 
"false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867591.37473: no more pending results, returning what we have 30575 1726867591.37476: results queue empty 30575 1726867591.37479: checking for any_errors_fatal 30575 1726867591.37484: done checking for any_errors_fatal 30575 1726867591.37484: checking for max_fail_percentage 30575 1726867591.37486: done checking for max_fail_percentage 30575 1726867591.37487: checking to see if all hosts have failed and the running result is not ok 30575 1726867591.37488: done checking to see if all hosts have failed 30575 1726867591.37488: getting the remaining hosts for this loop 30575 1726867591.37490: done getting the remaining hosts for this loop 30575 1726867591.37493: getting the next task for host managed_node3 30575 1726867591.37500: done getting next task for host managed_node3 30575 1726867591.37503: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30575 1726867591.37507: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867591.37510: getting variables 30575 1726867591.37511: in VariableManager get_vars() 30575 1726867591.37538: Calling all_inventory to load vars for managed_node3 30575 1726867591.37541: Calling groups_inventory to load vars for managed_node3 30575 1726867591.37544: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867591.37556: Calling all_plugins_play to load vars for managed_node3 30575 1726867591.37558: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867591.37561: Calling groups_plugins_play to load vars for managed_node3 30575 1726867591.42888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867591.44701: done with get_vars() 30575 1726867591.44722: done getting variables 30575 1726867591.44762: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867591.44867: variable 'profile' from source: play vars 30575 1726867591.44872: variable 'interface' from source: play vars 30575 1726867591.44939: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:26:31 -0400 (0:00:00.088) 0:00:26.827 ****** 30575 1726867591.44967: entering _queue_task() for managed_node3/set_fact 30575 1726867591.45644: worker is 1 (out of 1 available) 30575 1726867591.45657: exiting _queue_task() for managed_node3/set_fact 30575 
1726867591.45671: done queuing things up, now waiting for results queue to drain 30575 1726867591.45672: waiting for pending results... 30575 1726867591.46121: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr 30575 1726867591.46354: in run() - task 0affcac9-a3a5-e081-a588-00000000094f 30575 1726867591.46358: variable 'ansible_search_path' from source: unknown 30575 1726867591.46361: variable 'ansible_search_path' from source: unknown 30575 1726867591.46486: calling self._execute() 30575 1726867591.46540: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867591.46547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867591.46558: variable 'omit' from source: magic vars 30575 1726867591.46980: variable 'ansible_distribution_major_version' from source: facts 30575 1726867591.47061: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867591.47134: variable 'profile_stat' from source: set_fact 30575 1726867591.47150: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867591.47158: when evaluation is False, skipping this task 30575 1726867591.47169: _execute() done 30575 1726867591.47185: dumping result to json 30575 1726867591.47193: done dumping result, returning 30575 1726867591.47286: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-00000000094f] 30575 1726867591.47289: sending task result for task 0affcac9-a3a5-e081-a588-00000000094f 30575 1726867591.47358: done sending task result for task 0affcac9-a3a5-e081-a588-00000000094f 30575 1726867591.47362: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867591.47432: no more pending results, returning what we have 30575 1726867591.47436: results queue empty 30575 
1726867591.47437: checking for any_errors_fatal 30575 1726867591.47445: done checking for any_errors_fatal 30575 1726867591.47446: checking for max_fail_percentage 30575 1726867591.47448: done checking for max_fail_percentage 30575 1726867591.47449: checking to see if all hosts have failed and the running result is not ok 30575 1726867591.47450: done checking to see if all hosts have failed 30575 1726867591.47451: getting the remaining hosts for this loop 30575 1726867591.47452: done getting the remaining hosts for this loop 30575 1726867591.47456: getting the next task for host managed_node3 30575 1726867591.47466: done getting next task for host managed_node3 30575 1726867591.47469: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 30575 1726867591.47475: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867591.47482: getting variables 30575 1726867591.47484: in VariableManager get_vars() 30575 1726867591.47516: Calling all_inventory to load vars for managed_node3 30575 1726867591.47518: Calling groups_inventory to load vars for managed_node3 30575 1726867591.47522: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867591.47534: Calling all_plugins_play to load vars for managed_node3 30575 1726867591.47537: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867591.47540: Calling groups_plugins_play to load vars for managed_node3 30575 1726867591.49123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867591.51275: done with get_vars() 30575 1726867591.51296: done getting variables 30575 1726867591.51430: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867591.51693: variable 'profile' from source: play vars 30575 1726867591.51697: variable 'interface' from source: play vars 30575 1726867591.51765: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 17:26:31 -0400 (0:00:00.069) 0:00:26.896 ****** 30575 1726867591.51951: entering _queue_task() for managed_node3/assert 30575 1726867591.52682: worker is 1 (out of 1 available) 30575 1726867591.52702: exiting _queue_task() for managed_node3/assert 30575 1726867591.52714: done queuing things up, now waiting for results queue to drain 30575 1726867591.52716: waiting for pending results... 
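For readers tracing this log: the skipped task above (get_profile_stat.yml:69) is a `set_fact` guarded by `profile_stat.stat.exists`, both of which appear verbatim in the log entries ("Evaluated conditional (profile_stat.stat.exists): False", action plugin `set_fact`). A minimal hedged sketch of what that task likely looks like follows; the fact name `lsr_net_profile_fingerprint` is inferred from the later assert tasks in this log, and the fact's value expression is an assumption, not taken from the source tree:

```yaml
# Hedged reconstruction from the log above. The task name, action (set_fact),
# and the when: condition are logged verbatim; the fact name is inferred from
# the downstream asserts, and the value expression is purely illustrative.
- name: Verify the fingerprint comment in ifcfg-{{ profile }}
  set_fact:
    # Assumed shape: record whether the fingerprint comment was found
    lsr_net_profile_fingerprint: true
  when: profile_stat.stat.exists
```

When `profile_stat.stat.exists` is false, Ansible skips the task and reports `"false_condition": "profile_stat.stat.exists"` with `"skip_reason": "Conditional result was False"`, exactly as seen in the result JSON above.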
30575 1726867591.53352: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'statebr' 30575 1726867591.53429: in run() - task 0affcac9-a3a5-e081-a588-0000000008ae 30575 1726867591.53469: variable 'ansible_search_path' from source: unknown 30575 1726867591.53565: variable 'ansible_search_path' from source: unknown 30575 1726867591.53571: calling self._execute() 30575 1726867591.53811: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867591.53823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867591.54000: variable 'omit' from source: magic vars 30575 1726867591.54675: variable 'ansible_distribution_major_version' from source: facts 30575 1726867591.54681: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867591.54684: variable 'omit' from source: magic vars 30575 1726867591.54820: variable 'omit' from source: magic vars 30575 1726867591.55012: variable 'profile' from source: play vars 30575 1726867591.55021: variable 'interface' from source: play vars 30575 1726867591.55081: variable 'interface' from source: play vars 30575 1726867591.55168: variable 'omit' from source: magic vars 30575 1726867591.55350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867591.55353: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867591.55375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867591.55397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867591.55411: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867591.55450: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30575 1726867591.55458: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867591.55464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867591.55597: Set connection var ansible_pipelining to False 30575 1726867591.55608: Set connection var ansible_shell_type to sh 30575 1726867591.55625: Set connection var ansible_shell_executable to /bin/sh 30575 1726867591.55666: Set connection var ansible_timeout to 10 30575 1726867591.55906: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867591.55910: Set connection var ansible_connection to ssh 30575 1726867591.55913: variable 'ansible_shell_executable' from source: unknown 30575 1726867591.55915: variable 'ansible_connection' from source: unknown 30575 1726867591.55918: variable 'ansible_module_compression' from source: unknown 30575 1726867591.55920: variable 'ansible_shell_type' from source: unknown 30575 1726867591.55922: variable 'ansible_shell_executable' from source: unknown 30575 1726867591.55924: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867591.55927: variable 'ansible_pipelining' from source: unknown 30575 1726867591.55929: variable 'ansible_timeout' from source: unknown 30575 1726867591.55931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867591.55934: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867591.55970: variable 'omit' from source: magic vars 30575 1726867591.56011: starting attempt loop 30575 1726867591.56019: running the handler 30575 1726867591.56134: variable 'lsr_net_profile_exists' from source: set_fact 30575 1726867591.56226: Evaluated conditional 
(lsr_net_profile_exists): True 30575 1726867591.56229: handler run complete 30575 1726867591.56231: attempt loop complete, returning result 30575 1726867591.56233: _execute() done 30575 1726867591.56235: dumping result to json 30575 1726867591.56237: done dumping result, returning 30575 1726867591.56239: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'statebr' [0affcac9-a3a5-e081-a588-0000000008ae] 30575 1726867591.56240: sending task result for task 0affcac9-a3a5-e081-a588-0000000008ae ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867591.56488: no more pending results, returning what we have 30575 1726867591.56491: results queue empty 30575 1726867591.56492: checking for any_errors_fatal 30575 1726867591.56498: done checking for any_errors_fatal 30575 1726867591.56499: checking for max_fail_percentage 30575 1726867591.56500: done checking for max_fail_percentage 30575 1726867591.56501: checking to see if all hosts have failed and the running result is not ok 30575 1726867591.56502: done checking to see if all hosts have failed 30575 1726867591.56503: getting the remaining hosts for this loop 30575 1726867591.56504: done getting the remaining hosts for this loop 30575 1726867591.56508: getting the next task for host managed_node3 30575 1726867591.56515: done getting next task for host managed_node3 30575 1726867591.56518: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 30575 1726867591.56522: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867591.56526: getting variables 30575 1726867591.56528: in VariableManager get_vars() 30575 1726867591.56559: Calling all_inventory to load vars for managed_node3 30575 1726867591.56561: Calling groups_inventory to load vars for managed_node3 30575 1726867591.56565: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867591.56575: Calling all_plugins_play to load vars for managed_node3 30575 1726867591.56579: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867591.56582: Calling groups_plugins_play to load vars for managed_node3 30575 1726867591.57191: done sending task result for task 0affcac9-a3a5-e081-a588-0000000008ae 30575 1726867591.57194: WORKER PROCESS EXITING 30575 1726867591.58254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867591.60417: done with get_vars() 30575 1726867591.60443: done getting variables 30575 1726867591.60509: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867591.60685: variable 'profile' from source: play vars 30575 1726867591.60689: variable 'interface' from source: play vars 30575 1726867591.60809: variable 'interface' from source: play vars TASK [Assert that the 
ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 17:26:31 -0400 (0:00:00.089) 0:00:26.986 ****** 30575 1726867591.60856: entering _queue_task() for managed_node3/assert 30575 1726867591.61498: worker is 1 (out of 1 available) 30575 1726867591.61559: exiting _queue_task() for managed_node3/assert 30575 1726867591.61571: done queuing things up, now waiting for results queue to drain 30575 1726867591.61573: waiting for pending results... 30575 1726867591.61759: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'statebr' 30575 1726867591.61855: in run() - task 0affcac9-a3a5-e081-a588-0000000008af 30575 1726867591.61927: variable 'ansible_search_path' from source: unknown 30575 1726867591.61931: variable 'ansible_search_path' from source: unknown 30575 1726867591.61935: calling self._execute() 30575 1726867591.62059: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867591.62063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867591.62066: variable 'omit' from source: magic vars 30575 1726867591.62383: variable 'ansible_distribution_major_version' from source: facts 30575 1726867591.62390: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867591.62398: variable 'omit' from source: magic vars 30575 1726867591.62510: variable 'omit' from source: magic vars 30575 1726867591.62550: variable 'profile' from source: play vars 30575 1726867591.62556: variable 'interface' from source: play vars 30575 1726867591.62631: variable 'interface' from source: play vars 30575 1726867591.62648: variable 'omit' from source: magic vars 30575 1726867591.62693: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867591.62729: Loading Connection 
'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867591.62800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867591.62807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867591.62811: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867591.62814: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867591.62817: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867591.62842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867591.62942: Set connection var ansible_pipelining to False 30575 1726867591.62946: Set connection var ansible_shell_type to sh 30575 1726867591.63013: Set connection var ansible_shell_executable to /bin/sh 30575 1726867591.63017: Set connection var ansible_timeout to 10 30575 1726867591.63020: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867591.63022: Set connection var ansible_connection to ssh 30575 1726867591.63027: variable 'ansible_shell_executable' from source: unknown 30575 1726867591.63029: variable 'ansible_connection' from source: unknown 30575 1726867591.63032: variable 'ansible_module_compression' from source: unknown 30575 1726867591.63034: variable 'ansible_shell_type' from source: unknown 30575 1726867591.63035: variable 'ansible_shell_executable' from source: unknown 30575 1726867591.63056: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867591.63060: variable 'ansible_pipelining' from source: unknown 30575 1726867591.63063: variable 'ansible_timeout' from source: unknown 30575 1726867591.63065: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867591.63245: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867591.63248: variable 'omit' from source: magic vars 30575 1726867591.63251: starting attempt loop 30575 1726867591.63253: running the handler 30575 1726867591.63298: variable 'lsr_net_profile_ansible_managed' from source: set_fact 30575 1726867591.63304: Evaluated conditional (lsr_net_profile_ansible_managed): True 30575 1726867591.63310: handler run complete 30575 1726867591.63329: attempt loop complete, returning result 30575 1726867591.63332: _execute() done 30575 1726867591.63335: dumping result to json 30575 1726867591.63338: done dumping result, returning 30575 1726867591.63347: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'statebr' [0affcac9-a3a5-e081-a588-0000000008af] 30575 1726867591.63349: sending task result for task 0affcac9-a3a5-e081-a588-0000000008af 30575 1726867591.63444: done sending task result for task 0affcac9-a3a5-e081-a588-0000000008af 30575 1726867591.63447: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867591.63534: no more pending results, returning what we have 30575 1726867591.63537: results queue empty 30575 1726867591.63538: checking for any_errors_fatal 30575 1726867591.63544: done checking for any_errors_fatal 30575 1726867591.63545: checking for max_fail_percentage 30575 1726867591.63546: done checking for max_fail_percentage 30575 1726867591.63547: checking to see if all hosts have failed and the running result is not ok 30575 1726867591.63548: done checking to see if all hosts have failed 30575 1726867591.63549: getting the remaining hosts for this loop 30575 
1726867591.63550: done getting the remaining hosts for this loop 30575 1726867591.63553: getting the next task for host managed_node3 30575 1726867591.63561: done getting next task for host managed_node3 30575 1726867591.63564: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 30575 1726867591.63567: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867591.63570: getting variables 30575 1726867591.63572: in VariableManager get_vars() 30575 1726867591.63604: Calling all_inventory to load vars for managed_node3 30575 1726867591.63607: Calling groups_inventory to load vars for managed_node3 30575 1726867591.63612: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867591.63621: Calling all_plugins_play to load vars for managed_node3 30575 1726867591.63626: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867591.63629: Calling groups_plugins_play to load vars for managed_node3 30575 1726867591.65820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867591.67961: done with get_vars() 30575 1726867591.67994: done getting variables 30575 1726867591.68055: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867591.68163: variable 'profile' from source: play vars 30575 1726867591.68168: variable 'interface' from source: play vars 30575 1726867591.68230: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 17:26:31 -0400 (0:00:00.074) 0:00:27.060 ****** 30575 1726867591.68260: entering _queue_task() for managed_node3/assert 30575 1726867591.68576: worker is 1 (out of 1 available) 30575 1726867591.68592: exiting _queue_task() for managed_node3/assert 30575 1726867591.68604: done queuing things up, now waiting for results queue to drain 30575 1726867591.68606: waiting for pending results... 
30575 1726867591.68991: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in statebr 30575 1726867591.69002: in run() - task 0affcac9-a3a5-e081-a588-0000000008b0 30575 1726867591.69089: variable 'ansible_search_path' from source: unknown 30575 1726867591.69094: variable 'ansible_search_path' from source: unknown 30575 1726867591.69097: calling self._execute() 30575 1726867591.69166: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867591.69190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867591.69195: variable 'omit' from source: magic vars 30575 1726867591.69838: variable 'ansible_distribution_major_version' from source: facts 30575 1726867591.69842: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867591.69845: variable 'omit' from source: magic vars 30575 1726867591.69849: variable 'omit' from source: magic vars 30575 1726867591.69950: variable 'profile' from source: play vars 30575 1726867591.69954: variable 'interface' from source: play vars 30575 1726867591.70033: variable 'interface' from source: play vars 30575 1726867591.70061: variable 'omit' from source: magic vars 30575 1726867591.70148: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867591.70190: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867591.70218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867591.70240: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867591.70257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867591.70292: variable 'inventory_hostname' from source: host 
vars for 'managed_node3' 30575 1726867591.70312: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867591.70336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867591.70472: Set connection var ansible_pipelining to False 30575 1726867591.70526: Set connection var ansible_shell_type to sh 30575 1726867591.70530: Set connection var ansible_shell_executable to /bin/sh 30575 1726867591.70533: Set connection var ansible_timeout to 10 30575 1726867591.70535: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867591.70538: Set connection var ansible_connection to ssh 30575 1726867591.70553: variable 'ansible_shell_executable' from source: unknown 30575 1726867591.70562: variable 'ansible_connection' from source: unknown 30575 1726867591.70569: variable 'ansible_module_compression' from source: unknown 30575 1726867591.70575: variable 'ansible_shell_type' from source: unknown 30575 1726867591.70585: variable 'ansible_shell_executable' from source: unknown 30575 1726867591.70634: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867591.70637: variable 'ansible_pipelining' from source: unknown 30575 1726867591.70640: variable 'ansible_timeout' from source: unknown 30575 1726867591.70642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867591.70776: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867591.70796: variable 'omit' from source: magic vars 30575 1726867591.70810: starting attempt loop 30575 1726867591.70830: running the handler 30575 1726867591.70945: variable 'lsr_net_profile_fingerprint' from source: set_fact 30575 1726867591.70984: Evaluated 
conditional (lsr_net_profile_fingerprint): True 30575 1726867591.70990: handler run complete 30575 1726867591.70999: attempt loop complete, returning result 30575 1726867591.71067: _execute() done 30575 1726867591.71071: dumping result to json 30575 1726867591.71073: done dumping result, returning 30575 1726867591.71075: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in statebr [0affcac9-a3a5-e081-a588-0000000008b0] 30575 1726867591.71078: sending task result for task 0affcac9-a3a5-e081-a588-0000000008b0 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867591.71337: no more pending results, returning what we have 30575 1726867591.71341: results queue empty 30575 1726867591.71342: checking for any_errors_fatal 30575 1726867591.71349: done checking for any_errors_fatal 30575 1726867591.71350: checking for max_fail_percentage 30575 1726867591.71352: done checking for max_fail_percentage 30575 1726867591.71353: checking to see if all hosts have failed and the running result is not ok 30575 1726867591.71354: done checking to see if all hosts have failed 30575 1726867591.71355: getting the remaining hosts for this loop 30575 1726867591.71357: done getting the remaining hosts for this loop 30575 1726867591.71360: getting the next task for host managed_node3 30575 1726867591.71371: done getting next task for host managed_node3 30575 1726867591.71375: ^ task is: TASK: Conditional asserts 30575 1726867591.71379: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867591.71385: getting variables 30575 1726867591.71387: in VariableManager get_vars() 30575 1726867591.71421: Calling all_inventory to load vars for managed_node3 30575 1726867591.71423: Calling groups_inventory to load vars for managed_node3 30575 1726867591.71427: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867591.71438: Calling all_plugins_play to load vars for managed_node3 30575 1726867591.71441: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867591.71444: Calling groups_plugins_play to load vars for managed_node3 30575 1726867591.71990: done sending task result for task 0affcac9-a3a5-e081-a588-0000000008b0 30575 1726867591.71994: WORKER PROCESS EXITING 30575 1726867591.73238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867591.76951: done with get_vars() 30575 1726867591.76996: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 17:26:31 -0400 (0:00:00.090) 0:00:27.150 ****** 30575 1726867591.77292: entering _queue_task() for managed_node3/include_tasks 30575 1726867591.77910: worker is 1 (out of 1 available) 30575 1726867591.77929: exiting _queue_task() for managed_node3/include_tasks 30575 1726867591.77942: done queuing things up, now waiting for results queue to drain 30575 1726867591.77944: waiting for pending results... 
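The three assert tasks that just ran (assert_profile_present.yml:5, :10, and :15) each evaluate a single boolean fact, and the log records those expressions directly ("Evaluated conditional (lsr_net_profile_exists): True", and likewise for `lsr_net_profile_ansible_managed` and `lsr_net_profile_fingerprint`). A hedged reconstruction of that tasks file, with task names and conditions taken from the log and only the YAML layout assumed:

```yaml
# Hedged reconstruction of assert_profile_present.yml as implied by this log.
# Task names and the asserted expressions appear verbatim in the log entries;
# the exact file layout (e.g. any msg: fields) is an assumption.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists

- name: Assert that the ansible managed comment is present in '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_ansible_managed

- name: Assert that the fingerprint comment is present in {{ profile }}
  assert:
    that:
      - lsr_net_profile_fingerprint
```

Each assert that passes produces the `ok: [managed_node3] => {"changed": false}` result with `MSG: All assertions passed` seen above; `assert` never reports a change, so these tasks are safe to re-run.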
30575 1726867591.78556: running TaskExecutor() for managed_node3/TASK: Conditional asserts 30575 1726867591.78654: in run() - task 0affcac9-a3a5-e081-a588-0000000005ba 30575 1726867591.78669: variable 'ansible_search_path' from source: unknown 30575 1726867591.78673: variable 'ansible_search_path' from source: unknown 30575 1726867591.79044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867591.84484: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867591.84488: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867591.84516: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867591.84550: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867591.84580: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867591.84792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867591.84796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867591.84852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867591.84950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30575 1726867591.84964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867591.85112: dumping result to json 30575 1726867591.85115: done dumping result, returning 30575 1726867591.85122: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [0affcac9-a3a5-e081-a588-0000000005ba] 30575 1726867591.85127: sending task result for task 0affcac9-a3a5-e081-a588-0000000005ba 30575 1726867591.85784: done sending task result for task 0affcac9-a3a5-e081-a588-0000000005ba 30575 1726867591.85787: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "skipped_reason": "No items in the list" } 30575 1726867591.85834: no more pending results, returning what we have 30575 1726867591.85838: results queue empty 30575 1726867591.85839: checking for any_errors_fatal 30575 1726867591.85844: done checking for any_errors_fatal 30575 1726867591.85845: checking for max_fail_percentage 30575 1726867591.85847: done checking for max_fail_percentage 30575 1726867591.85847: checking to see if all hosts have failed and the running result is not ok 30575 1726867591.85848: done checking to see if all hosts have failed 30575 1726867591.85849: getting the remaining hosts for this loop 30575 1726867591.85850: done getting the remaining hosts for this loop 30575 1726867591.85854: getting the next task for host managed_node3 30575 1726867591.85860: done getting next task for host managed_node3 30575 1726867591.85863: ^ task is: TASK: Success in test '{{ lsr_description }}' 30575 1726867591.85866: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867591.85869: getting variables 30575 1726867591.85871: in VariableManager get_vars() 30575 1726867591.85906: Calling all_inventory to load vars for managed_node3 30575 1726867591.85909: Calling groups_inventory to load vars for managed_node3 30575 1726867591.85912: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867591.85921: Calling all_plugins_play to load vars for managed_node3 30575 1726867591.85926: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867591.85929: Calling groups_plugins_play to load vars for managed_node3 30575 1726867591.90338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867591.95757: done with get_vars() 30575 1726867591.95791: done getting variables 30575 1726867591.96190: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867591.96427: variable 'lsr_description' from source: include params TASK [Success in test 'I can create a profile without autoconnect'] ************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 17:26:31 -0400 (0:00:00.191) 0:00:27.342 ****** 30575 1726867591.96459: entering _queue_task() for managed_node3/debug 30575 1726867591.97625: worker is 1 (out of 
1 available) 30575 1726867591.97636: exiting _queue_task() for managed_node3/debug 30575 1726867591.97648: done queuing things up, now waiting for results queue to drain 30575 1726867591.97650: waiting for pending results... 30575 1726867591.98475: running TaskExecutor() for managed_node3/TASK: Success in test 'I can create a profile without autoconnect' 30575 1726867591.99084: in run() - task 0affcac9-a3a5-e081-a588-0000000005bb 30575 1726867591.99089: variable 'ansible_search_path' from source: unknown 30575 1726867591.99092: variable 'ansible_search_path' from source: unknown 30575 1726867591.99095: calling self._execute() 30575 1726867591.99157: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867591.99786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867591.99790: variable 'omit' from source: magic vars 30575 1726867592.00886: variable 'ansible_distribution_major_version' from source: facts 30575 1726867592.00890: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867592.00892: variable 'omit' from source: magic vars 30575 1726867592.00894: variable 'omit' from source: magic vars 30575 1726867592.00957: variable 'lsr_description' from source: include params 30575 1726867592.01382: variable 'omit' from source: magic vars 30575 1726867592.01385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867592.01387: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867592.01389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867592.01392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867592.01394: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30575 1726867592.01396: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867592.01397: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867592.01399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867592.01768: Set connection var ansible_pipelining to False 30575 1726867592.01988: Set connection var ansible_shell_type to sh 30575 1726867592.01999: Set connection var ansible_shell_executable to /bin/sh 30575 1726867592.02008: Set connection var ansible_timeout to 10 30575 1726867592.02017: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867592.02032: Set connection var ansible_connection to ssh 30575 1726867592.02065: variable 'ansible_shell_executable' from source: unknown 30575 1726867592.02076: variable 'ansible_connection' from source: unknown 30575 1726867592.02187: variable 'ansible_module_compression' from source: unknown 30575 1726867592.02196: variable 'ansible_shell_type' from source: unknown 30575 1726867592.02204: variable 'ansible_shell_executable' from source: unknown 30575 1726867592.02211: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867592.02219: variable 'ansible_pipelining' from source: unknown 30575 1726867592.02229: variable 'ansible_timeout' from source: unknown 30575 1726867592.02237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867592.02381: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867592.02883: variable 'omit' from source: magic vars 30575 1726867592.02886: starting attempt loop 30575 1726867592.02889: running the handler 30575 
1726867592.02891: handler run complete 30575 1726867592.02893: attempt loop complete, returning result 30575 1726867592.02895: _execute() done 30575 1726867592.02897: dumping result to json 30575 1726867592.02899: done dumping result, returning 30575 1726867592.03182: done running TaskExecutor() for managed_node3/TASK: Success in test 'I can create a profile without autoconnect' [0affcac9-a3a5-e081-a588-0000000005bb] 30575 1726867592.03185: sending task result for task 0affcac9-a3a5-e081-a588-0000000005bb 30575 1726867592.03254: done sending task result for task 0affcac9-a3a5-e081-a588-0000000005bb 30575 1726867592.03257: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: +++++ Success in test 'I can create a profile without autoconnect' +++++ 30575 1726867592.03316: no more pending results, returning what we have 30575 1726867592.03320: results queue empty 30575 1726867592.03321: checking for any_errors_fatal 30575 1726867592.03388: done checking for any_errors_fatal 30575 1726867592.03389: checking for max_fail_percentage 30575 1726867592.03391: done checking for max_fail_percentage 30575 1726867592.03392: checking to see if all hosts have failed and the running result is not ok 30575 1726867592.03393: done checking to see if all hosts have failed 30575 1726867592.03394: getting the remaining hosts for this loop 30575 1726867592.03395: done getting the remaining hosts for this loop 30575 1726867592.03400: getting the next task for host managed_node3 30575 1726867592.03409: done getting next task for host managed_node3 30575 1726867592.03413: ^ task is: TASK: Cleanup 30575 1726867592.03415: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867592.03421: getting variables 30575 1726867592.03423: in VariableManager get_vars() 30575 1726867592.03703: Calling all_inventory to load vars for managed_node3 30575 1726867592.03705: Calling groups_inventory to load vars for managed_node3 30575 1726867592.03709: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867592.03719: Calling all_plugins_play to load vars for managed_node3 30575 1726867592.03723: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867592.03726: Calling groups_plugins_play to load vars for managed_node3 30575 1726867592.07261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867592.11768: done with get_vars() 30575 1726867592.11801: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 17:26:32 -0400 (0:00:00.154) 0:00:27.496 ****** 30575 1726867592.11908: entering _queue_task() for managed_node3/include_tasks 30575 1726867592.13106: worker is 1 (out of 1 available) 30575 1726867592.13343: exiting _queue_task() for managed_node3/include_tasks 30575 1726867592.13356: done queuing things up, now waiting for results queue to drain 30575 1726867592.13358: waiting for pending results... 
30575 1726867592.14196: running TaskExecutor() for managed_node3/TASK: Cleanup 30575 1726867592.14201: in run() - task 0affcac9-a3a5-e081-a588-0000000005bf 30575 1726867592.14585: variable 'ansible_search_path' from source: unknown 30575 1726867592.14588: variable 'ansible_search_path' from source: unknown 30575 1726867592.14591: variable 'lsr_cleanup' from source: include params 30575 1726867592.15158: variable 'lsr_cleanup' from source: include params 30575 1726867592.15356: variable 'omit' from source: magic vars 30575 1726867592.15702: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867592.15896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867592.15913: variable 'omit' from source: magic vars 30575 1726867592.16259: variable 'ansible_distribution_major_version' from source: facts 30575 1726867592.16593: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867592.16605: variable 'item' from source: unknown 30575 1726867592.16672: variable 'item' from source: unknown 30575 1726867592.16911: variable 'item' from source: unknown 30575 1726867592.16975: variable 'item' from source: unknown 30575 1726867592.17482: dumping result to json 30575 1726867592.17486: done dumping result, returning 30575 1726867592.17489: done running TaskExecutor() for managed_node3/TASK: Cleanup [0affcac9-a3a5-e081-a588-0000000005bf] 30575 1726867592.17492: sending task result for task 0affcac9-a3a5-e081-a588-0000000005bf 30575 1726867592.17540: done sending task result for task 0affcac9-a3a5-e081-a588-0000000005bf 30575 1726867592.17543: WORKER PROCESS EXITING 30575 1726867592.17567: no more pending results, returning what we have 30575 1726867592.17572: in VariableManager get_vars() 30575 1726867592.17614: Calling all_inventory to load vars for managed_node3 30575 1726867592.17616: Calling groups_inventory to load vars for managed_node3 30575 1726867592.17620: Calling 
all_plugins_inventory to load vars for managed_node3 30575 1726867592.17634: Calling all_plugins_play to load vars for managed_node3 30575 1726867592.17638: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867592.17647: Calling groups_plugins_play to load vars for managed_node3 30575 1726867592.22213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867592.26838: done with get_vars() 30575 1726867592.26983: variable 'ansible_search_path' from source: unknown 30575 1726867592.26985: variable 'ansible_search_path' from source: unknown 30575 1726867592.27024: we have included files to process 30575 1726867592.27026: generating all_blocks data 30575 1726867592.27028: done generating all_blocks data 30575 1726867592.27033: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867592.27034: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867592.27036: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867592.27470: done processing included file 30575 1726867592.27472: iterating over new_blocks loaded from include file 30575 1726867592.27474: in VariableManager get_vars() 30575 1726867592.27639: done with get_vars() 30575 1726867592.27641: filtering new block on tags 30575 1726867592.27667: done filtering new block on tags 30575 1726867592.27671: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node3 => (item=tasks/cleanup_profile+device.yml) 30575 1726867592.27676: extending task lists for all hosts with included blocks 
30575 1726867592.31206: done extending task lists 30575 1726867592.31209: done processing included files 30575 1726867592.31210: results queue empty 30575 1726867592.31211: checking for any_errors_fatal 30575 1726867592.31214: done checking for any_errors_fatal 30575 1726867592.31215: checking for max_fail_percentage 30575 1726867592.31216: done checking for max_fail_percentage 30575 1726867592.31217: checking to see if all hosts have failed and the running result is not ok 30575 1726867592.31218: done checking to see if all hosts have failed 30575 1726867592.31219: getting the remaining hosts for this loop 30575 1726867592.31221: done getting the remaining hosts for this loop 30575 1726867592.31224: getting the next task for host managed_node3 30575 1726867592.31229: done getting next task for host managed_node3 30575 1726867592.31231: ^ task is: TASK: Cleanup profile and device 30575 1726867592.31234: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867592.31237: getting variables 30575 1726867592.31238: in VariableManager get_vars() 30575 1726867592.31251: Calling all_inventory to load vars for managed_node3 30575 1726867592.31253: Calling groups_inventory to load vars for managed_node3 30575 1726867592.31256: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867592.31262: Calling all_plugins_play to load vars for managed_node3 30575 1726867592.31264: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867592.31267: Calling groups_plugins_play to load vars for managed_node3 30575 1726867592.34437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867592.38399: done with get_vars() 30575 1726867592.38422: done getting variables 30575 1726867592.38464: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 17:26:32 -0400 (0:00:00.267) 0:00:27.763 ****** 30575 1726867592.38616: entering _queue_task() for managed_node3/shell 30575 1726867592.39391: worker is 1 (out of 1 available) 30575 1726867592.39404: exiting _queue_task() for managed_node3/shell 30575 1726867592.39416: done queuing things up, now waiting for results queue to drain 30575 1726867592.39418: waiting for pending results... 
30575 1726867592.40197: running TaskExecutor() for managed_node3/TASK: Cleanup profile and device 30575 1726867592.40206: in run() - task 0affcac9-a3a5-e081-a588-0000000009a0 30575 1726867592.40210: variable 'ansible_search_path' from source: unknown 30575 1726867592.40213: variable 'ansible_search_path' from source: unknown 30575 1726867592.40485: calling self._execute() 30575 1726867592.40510: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867592.40517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867592.40529: variable 'omit' from source: magic vars 30575 1726867592.41457: variable 'ansible_distribution_major_version' from source: facts 30575 1726867592.41483: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867592.41487: variable 'omit' from source: magic vars 30575 1726867592.41521: variable 'omit' from source: magic vars 30575 1726867592.41930: variable 'interface' from source: play vars 30575 1726867592.42084: variable 'omit' from source: magic vars 30575 1726867592.42110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867592.42147: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867592.42167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867592.42292: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867592.42411: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867592.42414: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867592.42417: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867592.42419: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867592.42559: Set connection var ansible_pipelining to False 30575 1726867592.42562: Set connection var ansible_shell_type to sh 30575 1726867592.42568: Set connection var ansible_shell_executable to /bin/sh 30575 1726867592.42575: Set connection var ansible_timeout to 10 30575 1726867592.42629: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867592.42638: Set connection var ansible_connection to ssh 30575 1726867592.42663: variable 'ansible_shell_executable' from source: unknown 30575 1726867592.42667: variable 'ansible_connection' from source: unknown 30575 1726867592.42670: variable 'ansible_module_compression' from source: unknown 30575 1726867592.42672: variable 'ansible_shell_type' from source: unknown 30575 1726867592.42675: variable 'ansible_shell_executable' from source: unknown 30575 1726867592.42680: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867592.42683: variable 'ansible_pipelining' from source: unknown 30575 1726867592.42685: variable 'ansible_timeout' from source: unknown 30575 1726867592.42688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867592.43039: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867592.43060: variable 'omit' from source: magic vars 30575 1726867592.43065: starting attempt loop 30575 1726867592.43069: running the handler 30575 1726867592.43072: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867592.43170: _low_level_execute_command(): starting 30575 1726867592.43173: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867592.44086: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867592.44093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867592.44263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867592.46183: stdout chunk (state=3): >>>/root <<< 30575 1726867592.46347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867592.46351: stdout chunk (state=3): >>><<< 30575 1726867592.46354: stderr chunk (state=3): >>><<< 30575 1726867592.46357: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867592.46361: _low_level_execute_command(): starting 30575 1726867592.46366: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867592.4633586-31866-186716114771232 `" && echo ansible-tmp-1726867592.4633586-31866-186716114771232="` echo /root/.ansible/tmp/ansible-tmp-1726867592.4633586-31866-186716114771232 `" ) && sleep 0' 30575 1726867592.47505: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867592.47519: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867592.47535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867592.47560: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867592.47609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867592.47774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867592.47796: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867592.47809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867592.47957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867592.49874: stdout chunk (state=3): >>>ansible-tmp-1726867592.4633586-31866-186716114771232=/root/.ansible/tmp/ansible-tmp-1726867592.4633586-31866-186716114771232 <<< 30575 1726867592.50042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867592.50045: stdout chunk (state=3): >>><<< 30575 1726867592.50054: stderr chunk (state=3): >>><<< 30575 1726867592.50211: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867592.4633586-31866-186716114771232=/root/.ansible/tmp/ansible-tmp-1726867592.4633586-31866-186716114771232 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867592.50216: variable 'ansible_module_compression' from source: unknown 30575 1726867592.50289: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867592.50355: variable 'ansible_facts' from source: unknown 30575 1726867592.50584: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867592.4633586-31866-186716114771232/AnsiballZ_command.py 30575 1726867592.50806: Sending initial data 30575 1726867592.50816: Sent initial data (156 bytes) 30575 1726867592.51961: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867592.52115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867592.52136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867592.52201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867592.53758: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867592.53819: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867592.53846: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp8kb5n69k /root/.ansible/tmp/ansible-tmp-1726867592.4633586-31866-186716114771232/AnsiballZ_command.py <<< 30575 1726867592.53917: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867592.4633586-31866-186716114771232/AnsiballZ_command.py" <<< 30575 1726867592.54039: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp8kb5n69k" to remote "/root/.ansible/tmp/ansible-tmp-1726867592.4633586-31866-186716114771232/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867592.4633586-31866-186716114771232/AnsiballZ_command.py" <<< 30575 1726867592.55332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867592.55404: stderr chunk (state=3): >>><<< 30575 1726867592.55483: stdout chunk (state=3): >>><<< 30575 1726867592.55487: done transferring module to remote 30575 1726867592.55588: _low_level_execute_command(): starting 30575 1726867592.55591: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867592.4633586-31866-186716114771232/ /root/.ansible/tmp/ansible-tmp-1726867592.4633586-31866-186716114771232/AnsiballZ_command.py && sleep 0' 30575 1726867592.56785: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867592.56790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867592.56802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867592.56861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867592.56976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867592.57036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867592.58784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867592.58815: stderr chunk (state=3): >>><<< 30575 1726867592.59088: stdout chunk (state=3): >>><<< 30575 1726867592.59091: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867592.59095: _low_level_execute_command(): starting 30575 1726867592.59097: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867592.4633586-31866-186716114771232/AnsiballZ_command.py && sleep 0' 30575 1726867592.60189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867592.60429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867592.60479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867592.60518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867592.60663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 30575 1726867592.79491: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (4a22b8e7-8099-4ce9-82e9-2718d4e0ef58) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 17:26:32.760294", "end": "2024-09-20 17:26:32.792507", "delta": "0:00:00.032213", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867592.81127: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867592.81131: stdout chunk (state=3): >>><<< 30575 1726867592.81145: stderr chunk (state=3): >>><<< 30575 1726867592.81234: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (4a22b8e7-8099-4ce9-82e9-2718d4e0ef58) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 17:26:32.760294", "end": "2024-09-20 17:26:32.792507", "delta": "0:00:00.032213", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. 30575 1726867592.81273: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867592.4633586-31866-186716114771232/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867592.81283: _low_level_execute_command(): starting 30575 1726867592.81289: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867592.4633586-31866-186716114771232/ > /dev/null 2>&1 && sleep 0' 30575 1726867592.82183: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867592.82688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867592.82713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867592.82784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867592.84621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867592.84625: stdout chunk (state=3): >>><<< 30575 1726867592.84635: stderr chunk (state=3): >>><<< 30575 1726867592.84651: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867592.84658: handler run complete 30575 1726867592.84682: Evaluated conditional (False): False 30575 1726867592.84691: attempt loop complete, returning result 30575 1726867592.84699: _execute() done 30575 1726867592.84702: dumping result to json 30575 1726867592.84704: done dumping result, returning 30575 1726867592.84710: done running TaskExecutor() for managed_node3/TASK: Cleanup profile and device [0affcac9-a3a5-e081-a588-0000000009a0] 30575 1726867592.84715: sending task result for task 0affcac9-a3a5-e081-a588-0000000009a0 fatal: [managed_node3]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.032213", "end": "2024-09-20 17:26:32.792507", "rc": 1, "start": "2024-09-20 17:26:32.760294" } STDOUT: Connection 'statebr' (4a22b8e7-8099-4ce9-82e9-2718d4e0ef58) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 30575 1726867592.84962: no more pending results, returning what we have 30575 1726867592.84967: results queue empty 30575 1726867592.84967: checking for any_errors_fatal 30575 1726867592.84968: done checking for any_errors_fatal 30575 1726867592.84969: checking for max_fail_percentage 30575 1726867592.84971: done checking for max_fail_percentage 30575 1726867592.84972: checking to see if all hosts have failed and the running result is not ok 30575 1726867592.84973: done checking to see if all hosts have failed 30575 1726867592.84973: getting the remaining hosts for this loop 30575 1726867592.84975: done getting the remaining hosts for this loop 30575 1726867592.84981: getting the next task for host managed_node3 30575 1726867592.84992: done getting next task for host managed_node3 30575 1726867592.84995: ^ task is: TASK: Include the task 'run_test.yml' 30575 1726867592.84997: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867592.85002: getting variables 30575 1726867592.85004: in VariableManager get_vars() 30575 1726867592.85039: Calling all_inventory to load vars for managed_node3 30575 1726867592.85042: Calling groups_inventory to load vars for managed_node3 30575 1726867592.85045: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867592.85057: Calling all_plugins_play to load vars for managed_node3 30575 1726867592.85060: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867592.85063: Calling groups_plugins_play to load vars for managed_node3 30575 1726867592.85885: done sending task result for task 0affcac9-a3a5-e081-a588-0000000009a0 30575 1726867592.85889: WORKER PROCESS EXITING 30575 1726867592.88330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867592.91256: done with get_vars() 30575 1726867592.91284: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:65 Friday 20 September 2024 17:26:32 -0400 (0:00:00.527) 0:00:28.291 ****** 30575 1726867592.91376: entering _queue_task() for managed_node3/include_tasks 30575 1726867592.92323: worker is 1 (out of 1 available) 30575 1726867592.92335: exiting _queue_task() for managed_node3/include_tasks 30575 1726867592.92348: done queuing things up, now waiting for results queue to drain 30575 1726867592.92350: waiting for pending results... 
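The "Cleanup profile and device" task above exits rc=1 (the ifcfg file and device are already gone) but the run continues because the task tolerates failures — the "...ignoring" marker in the fatal output means `ignore_errors` is set. A hedged sketch of how that task likely looks in the test playbook, reconstructed from the `cmd` shown verbatim in the log (the actual task file is not shown here):

```yaml
# Hedged reconstruction from the log's module_args; field names beyond
# the command text are assumptions, not the actual test source.
- name: Cleanup profile and device
  ansible.builtin.shell: |
    nmcli con delete statebr
    nmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr
    rm -f /etc/sysconfig/network-scripts/ifcfg-statebr
    ip link del statebr
  ignore_errors: true   # rc=1 is expected when the profile/device was already removed
```

With `ignore_errors: true`, Ansible still records and displays the failure (hence the `fatal:` block above) but does not abort the play, which is why the log proceeds to queue the next task.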
30575 1726867592.92749: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' 30575 1726867592.92932: in run() - task 0affcac9-a3a5-e081-a588-000000000011 30575 1726867592.92954: variable 'ansible_search_path' from source: unknown 30575 1726867592.93221: calling self._execute() 30575 1726867592.93322: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867592.93339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867592.93355: variable 'omit' from source: magic vars 30575 1726867592.94049: variable 'ansible_distribution_major_version' from source: facts 30575 1726867592.94069: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867592.94082: _execute() done 30575 1726867592.94093: dumping result to json 30575 1726867592.94106: done dumping result, returning 30575 1726867592.94282: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [0affcac9-a3a5-e081-a588-000000000011] 30575 1726867592.94285: sending task result for task 0affcac9-a3a5-e081-a588-000000000011 30575 1726867592.94583: done sending task result for task 0affcac9-a3a5-e081-a588-000000000011 30575 1726867592.94586: WORKER PROCESS EXITING 30575 1726867592.94614: no more pending results, returning what we have 30575 1726867592.94620: in VariableManager get_vars() 30575 1726867592.94660: Calling all_inventory to load vars for managed_node3 30575 1726867592.94663: Calling groups_inventory to load vars for managed_node3 30575 1726867592.94667: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867592.94684: Calling all_plugins_play to load vars for managed_node3 30575 1726867592.94687: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867592.94690: Calling groups_plugins_play to load vars for managed_node3 30575 1726867592.97805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30575 1726867593.00664: done with get_vars() 30575 1726867593.00689: variable 'ansible_search_path' from source: unknown 30575 1726867593.00703: we have included files to process 30575 1726867593.00704: generating all_blocks data 30575 1726867593.00706: done generating all_blocks data 30575 1726867593.00711: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867593.00712: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867593.00714: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867593.01527: in VariableManager get_vars() 30575 1726867593.01547: done with get_vars() 30575 1726867593.01790: in VariableManager get_vars() 30575 1726867593.01808: done with get_vars() 30575 1726867593.01848: in VariableManager get_vars() 30575 1726867593.01864: done with get_vars() 30575 1726867593.01904: in VariableManager get_vars() 30575 1726867593.01920: done with get_vars() 30575 1726867593.01959: in VariableManager get_vars() 30575 1726867593.01975: done with get_vars() 30575 1726867593.02668: in VariableManager get_vars() 30575 1726867593.02987: done with get_vars() 30575 1726867593.03001: done processing included file 30575 1726867593.03003: iterating over new_blocks loaded from include file 30575 1726867593.03004: in VariableManager get_vars() 30575 1726867593.03015: done with get_vars() 30575 1726867593.03017: filtering new block on tags 30575 1726867593.03117: done filtering new block on tags 30575 1726867593.03120: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3 30575 1726867593.03125: extending task lists for all hosts with included 
blocks 30575 1726867593.03161: done extending task lists 30575 1726867593.03162: done processing included files 30575 1726867593.03163: results queue empty 30575 1726867593.03163: checking for any_errors_fatal 30575 1726867593.03167: done checking for any_errors_fatal 30575 1726867593.03168: checking for max_fail_percentage 30575 1726867593.03169: done checking for max_fail_percentage 30575 1726867593.03170: checking to see if all hosts have failed and the running result is not ok 30575 1726867593.03171: done checking to see if all hosts have failed 30575 1726867593.03171: getting the remaining hosts for this loop 30575 1726867593.03172: done getting the remaining hosts for this loop 30575 1726867593.03175: getting the next task for host managed_node3 30575 1726867593.03283: done getting next task for host managed_node3 30575 1726867593.03286: ^ task is: TASK: TEST: {{ lsr_description }} 30575 1726867593.03289: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867593.03291: getting variables 30575 1726867593.03292: in VariableManager get_vars() 30575 1726867593.03301: Calling all_inventory to load vars for managed_node3 30575 1726867593.03303: Calling groups_inventory to load vars for managed_node3 30575 1726867593.03305: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867593.03310: Calling all_plugins_play to load vars for managed_node3 30575 1726867593.03312: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867593.03315: Calling groups_plugins_play to load vars for managed_node3 30575 1726867593.05719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867593.09085: done with get_vars() 30575 1726867593.09107: done getting variables 30575 1726867593.09151: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867593.09375: variable 'lsr_description' from source: include params TASK [TEST: I can activate an existing profile] ******************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 17:26:33 -0400 (0:00:00.181) 0:00:28.472 ****** 30575 1726867593.09494: entering _queue_task() for managed_node3/debug 30575 1726867593.10292: worker is 1 (out of 1 available) 30575 1726867593.10306: exiting _queue_task() for managed_node3/debug 30575 1726867593.10318: done queuing things up, now waiting for results queue to drain 30575 1726867593.10320: waiting for pending results... 
30575 1726867593.10932: running TaskExecutor() for managed_node3/TASK: TEST: I can activate an existing profile 30575 1726867593.11044: in run() - task 0affcac9-a3a5-e081-a588-000000000a49 30575 1726867593.11074: variable 'ansible_search_path' from source: unknown 30575 1726867593.11144: variable 'ansible_search_path' from source: unknown 30575 1726867593.11187: calling self._execute() 30575 1726867593.11340: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.11485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.11493: variable 'omit' from source: magic vars 30575 1726867593.12175: variable 'ansible_distribution_major_version' from source: facts 30575 1726867593.12441: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867593.12445: variable 'omit' from source: magic vars 30575 1726867593.12447: variable 'omit' from source: magic vars 30575 1726867593.12534: variable 'lsr_description' from source: include params 30575 1726867593.12766: variable 'omit' from source: magic vars 30575 1726867593.12769: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867593.12772: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867593.12893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867593.12916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.12934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.12971: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867593.13286: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.13289: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.13292: Set connection var ansible_pipelining to False 30575 1726867593.13294: Set connection var ansible_shell_type to sh 30575 1726867593.13296: Set connection var ansible_shell_executable to /bin/sh 30575 1726867593.13298: Set connection var ansible_timeout to 10 30575 1726867593.13299: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867593.13301: Set connection var ansible_connection to ssh 30575 1726867593.13303: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.13305: variable 'ansible_connection' from source: unknown 30575 1726867593.13307: variable 'ansible_module_compression' from source: unknown 30575 1726867593.13309: variable 'ansible_shell_type' from source: unknown 30575 1726867593.13311: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.13313: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.13314: variable 'ansible_pipelining' from source: unknown 30575 1726867593.13316: variable 'ansible_timeout' from source: unknown 30575 1726867593.13318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.13717: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867593.13737: variable 'omit' from source: magic vars 30575 1726867593.13843: starting attempt loop 30575 1726867593.13885: running the handler 30575 1726867593.13899: handler run complete 30575 1726867593.13913: attempt loop complete, returning result 30575 1726867593.13916: _execute() done 30575 1726867593.13919: dumping result to json 30575 1726867593.13921: done dumping result, returning 30575 
1726867593.13928: done running TaskExecutor() for managed_node3/TASK: TEST: I can activate an existing profile [0affcac9-a3a5-e081-a588-000000000a49] 30575 1726867593.13934: sending task result for task 0affcac9-a3a5-e081-a588-000000000a49 30575 1726867593.14248: done sending task result for task 0affcac9-a3a5-e081-a588-000000000a49 30575 1726867593.14251: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ########## I can activate an existing profile ########## 30575 1726867593.14310: no more pending results, returning what we have 30575 1726867593.14314: results queue empty 30575 1726867593.14315: checking for any_errors_fatal 30575 1726867593.14317: done checking for any_errors_fatal 30575 1726867593.14317: checking for max_fail_percentage 30575 1726867593.14319: done checking for max_fail_percentage 30575 1726867593.14320: checking to see if all hosts have failed and the running result is not ok 30575 1726867593.14322: done checking to see if all hosts have failed 30575 1726867593.14322: getting the remaining hosts for this loop 30575 1726867593.14324: done getting the remaining hosts for this loop 30575 1726867593.14328: getting the next task for host managed_node3 30575 1726867593.14336: done getting next task for host managed_node3 30575 1726867593.14339: ^ task is: TASK: Show item 30575 1726867593.14342: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867593.14346: getting variables 30575 1726867593.14348: in VariableManager get_vars() 30575 1726867593.14383: Calling all_inventory to load vars for managed_node3 30575 1726867593.14385: Calling groups_inventory to load vars for managed_node3 30575 1726867593.14392: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867593.14404: Calling all_plugins_play to load vars for managed_node3 30575 1726867593.14408: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867593.14411: Calling groups_plugins_play to load vars for managed_node3 30575 1726867593.18206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867593.21642: done with get_vars() 30575 1726867593.21673: done getting variables 30575 1726867593.22137: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 17:26:33 -0400 (0:00:00.126) 0:00:28.599 ****** 30575 1726867593.22167: entering _queue_task() for managed_node3/debug 30575 1726867593.23317: worker is 1 (out of 1 available) 30575 1726867593.23329: exiting _queue_task() for managed_node3/debug 30575 1726867593.23341: done queuing things up, now waiting for results queue to drain 30575 1726867593.23342: waiting for pending results... 
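The TEST banner above is produced by a debug task whose name is templated from `lsr_description` ("TASK [TEST: I can activate an existing profile]" at run_test.yml:5, printing the description between `##########` rows). A hedged sketch of that task, inferred only from the task name and MSG output in this log (the real run_test.yml may differ):

```yaml
# Assumed shape of run_test.yml:5 based on the logged task name and output;
# the exact message formatting is a guess.
- name: "TEST: {{ lsr_description }}"
  ansible.builtin.debug:
    msg: "##########\n{{ lsr_description }}\n##########"
```

The subsequent "Show item" task then loops over variables such as `lsr_description`, echoing each item's value, as seen in the `ok: [managed_node3] => (item=lsr_description)` result further down.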
30575 1726867593.23645: running TaskExecutor() for managed_node3/TASK: Show item 30575 1726867593.23851: in run() - task 0affcac9-a3a5-e081-a588-000000000a4a 30575 1726867593.23959: variable 'ansible_search_path' from source: unknown 30575 1726867593.23964: variable 'ansible_search_path' from source: unknown 30575 1726867593.24001: variable 'omit' from source: magic vars 30575 1726867593.24380: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.24405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.24421: variable 'omit' from source: magic vars 30575 1726867593.25881: variable 'ansible_distribution_major_version' from source: facts 30575 1726867593.25885: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867593.25888: variable 'omit' from source: magic vars 30575 1726867593.25890: variable 'omit' from source: magic vars 30575 1726867593.26013: variable 'item' from source: unknown 30575 1726867593.26208: variable 'item' from source: unknown 30575 1726867593.26235: variable 'omit' from source: magic vars 30575 1726867593.26356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867593.26569: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867593.26596: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867593.26663: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.26766: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.26802: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867593.26867: variable 'ansible_host' from source: host vars for 'managed_node3' 
30575 1726867593.27188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.27192: Set connection var ansible_pipelining to False 30575 1726867593.27254: Set connection var ansible_shell_type to sh 30575 1726867593.27270: Set connection var ansible_shell_executable to /bin/sh 30575 1726867593.27415: Set connection var ansible_timeout to 10 30575 1726867593.27431: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867593.27444: Set connection var ansible_connection to ssh 30575 1726867593.27475: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.27529: variable 'ansible_connection' from source: unknown 30575 1726867593.27565: variable 'ansible_module_compression' from source: unknown 30575 1726867593.27572: variable 'ansible_shell_type' from source: unknown 30575 1726867593.27584: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.27591: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.27599: variable 'ansible_pipelining' from source: unknown 30575 1726867593.27605: variable 'ansible_timeout' from source: unknown 30575 1726867593.27612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.27943: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867593.27972: variable 'omit' from source: magic vars 30575 1726867593.28179: starting attempt loop 30575 1726867593.28183: running the handler 30575 1726867593.28186: variable 'lsr_description' from source: include params 30575 1726867593.28314: variable 'lsr_description' from source: include params 30575 1726867593.28331: handler run complete 30575 1726867593.28353: attempt loop 
complete, returning result 30575 1726867593.28406: variable 'item' from source: unknown 30575 1726867593.28500: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can activate an existing profile" } 30575 1726867593.29154: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.29158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.29161: variable 'omit' from source: magic vars 30575 1726867593.29249: variable 'ansible_distribution_major_version' from source: facts 30575 1726867593.29481: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867593.29484: variable 'omit' from source: magic vars 30575 1726867593.29487: variable 'omit' from source: magic vars 30575 1726867593.29489: variable 'item' from source: unknown 30575 1726867593.29607: variable 'item' from source: unknown 30575 1726867593.29630: variable 'omit' from source: magic vars 30575 1726867593.29655: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867593.29704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.29716: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.29735: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867593.29808: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.29816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.30182: Set connection var ansible_pipelining to False 30575 1726867593.30299: Set connection var ansible_shell_type to sh 30575 
1726867593.30303: Set connection var ansible_shell_executable to /bin/sh 30575 1726867593.30305: Set connection var ansible_timeout to 10 30575 1726867593.30307: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867593.30309: Set connection var ansible_connection to ssh 30575 1726867593.30311: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.30313: variable 'ansible_connection' from source: unknown 30575 1726867593.30315: variable 'ansible_module_compression' from source: unknown 30575 1726867593.30317: variable 'ansible_shell_type' from source: unknown 30575 1726867593.30319: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.30321: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.30326: variable 'ansible_pipelining' from source: unknown 30575 1726867593.30328: variable 'ansible_timeout' from source: unknown 30575 1726867593.30330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.30627: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867593.30693: variable 'omit' from source: magic vars 30575 1726867593.30702: starting attempt loop 30575 1726867593.30711: running the handler 30575 1726867593.30743: variable 'lsr_setup' from source: include params 30575 1726867593.30817: variable 'lsr_setup' from source: include params 30575 1726867593.30992: handler run complete 30575 1726867593.31009: attempt loop complete, returning result 30575 1726867593.31032: variable 'item' from source: unknown 30575 1726867593.31284: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/create_bridge_profile.yml" ] } 30575 1726867593.31609: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.31613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.31615: variable 'omit' from source: magic vars 30575 1726867593.31860: variable 'ansible_distribution_major_version' from source: facts 30575 1726867593.31883: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867593.31886: variable 'omit' from source: magic vars 30575 1726867593.31898: variable 'omit' from source: magic vars 30575 1726867593.31945: variable 'item' from source: unknown 30575 1726867593.32151: variable 'item' from source: unknown 30575 1726867593.32154: variable 'omit' from source: magic vars 30575 1726867593.32476: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867593.32481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.32483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.32485: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867593.32487: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.32489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.32589: Set connection var ansible_pipelining to False 30575 1726867593.32597: Set connection var ansible_shell_type to sh 30575 1726867593.32694: Set connection var ansible_shell_executable to /bin/sh 30575 1726867593.32704: Set connection var ansible_timeout to 10 30575 1726867593.32713: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867593.32726: Set connection var 
ansible_connection to ssh 30575 1726867593.32748: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.33016: variable 'ansible_connection' from source: unknown 30575 1726867593.33019: variable 'ansible_module_compression' from source: unknown 30575 1726867593.33022: variable 'ansible_shell_type' from source: unknown 30575 1726867593.33027: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.33029: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.33031: variable 'ansible_pipelining' from source: unknown 30575 1726867593.33033: variable 'ansible_timeout' from source: unknown 30575 1726867593.33035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.33037: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867593.33039: variable 'omit' from source: magic vars 30575 1726867593.33188: starting attempt loop 30575 1726867593.33196: running the handler 30575 1726867593.33382: variable 'lsr_test' from source: include params 30575 1726867593.33460: variable 'lsr_test' from source: include params 30575 1726867593.33482: handler run complete 30575 1726867593.33499: attempt loop complete, returning result 30575 1726867593.33573: variable 'item' from source: unknown 30575 1726867593.33636: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/activate_profile.yml" ] } 30575 1726867593.34054: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.34058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.34384: variable 'omit' from source: 
magic vars 30575 1726867593.34654: variable 'ansible_distribution_major_version' from source: facts 30575 1726867593.34938: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867593.34942: variable 'omit' from source: magic vars 30575 1726867593.34944: variable 'omit' from source: magic vars 30575 1726867593.34946: variable 'item' from source: unknown 30575 1726867593.35382: variable 'item' from source: unknown 30575 1726867593.35387: variable 'omit' from source: magic vars 30575 1726867593.35389: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867593.35392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.35394: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.35396: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867593.35398: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.35400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.35563: Set connection var ansible_pipelining to False 30575 1726867593.35633: Set connection var ansible_shell_type to sh 30575 1726867593.35645: Set connection var ansible_shell_executable to /bin/sh 30575 1726867593.35655: Set connection var ansible_timeout to 10 30575 1726867593.35842: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867593.35845: Set connection var ansible_connection to ssh 30575 1726867593.35848: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.35850: variable 'ansible_connection' from source: unknown 30575 1726867593.35852: variable 'ansible_module_compression' from source: unknown 30575 1726867593.35854: 
variable 'ansible_shell_type' from source: unknown 30575 1726867593.35856: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.35858: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.36062: variable 'ansible_pipelining' from source: unknown 30575 1726867593.36065: variable 'ansible_timeout' from source: unknown 30575 1726867593.36068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.36155: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867593.36483: variable 'omit' from source: magic vars 30575 1726867593.36486: starting attempt loop 30575 1726867593.36489: running the handler 30575 1726867593.36491: variable 'lsr_assert' from source: include params 30575 1726867593.36522: variable 'lsr_assert' from source: include params 30575 1726867593.36604: handler run complete 30575 1726867593.36627: attempt loop complete, returning result 30575 1726867593.36833: variable 'item' from source: unknown 30575 1726867593.36836: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_present.yml", "tasks/assert_profile_present.yml" ] } 30575 1726867593.37082: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.37269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.37273: variable 'omit' from source: magic vars 30575 1726867593.37686: variable 'ansible_distribution_major_version' from source: facts 30575 1726867593.37690: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867593.37697: variable 'omit' 
from source: magic vars 30575 1726867593.37701: variable 'omit' from source: magic vars 30575 1726867593.37715: variable 'item' from source: unknown 30575 1726867593.37781: variable 'item' from source: unknown 30575 1726867593.38301: variable 'omit' from source: magic vars 30575 1726867593.38317: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867593.38328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.38336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.38348: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867593.38351: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.38354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.38422: Set connection var ansible_pipelining to False 30575 1726867593.38425: Set connection var ansible_shell_type to sh 30575 1726867593.38433: Set connection var ansible_shell_executable to /bin/sh 30575 1726867593.38440: Set connection var ansible_timeout to 10 30575 1726867593.38442: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867593.38451: Set connection var ansible_connection to ssh 30575 1726867593.38471: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.38475: variable 'ansible_connection' from source: unknown 30575 1726867593.38479: variable 'ansible_module_compression' from source: unknown 30575 1726867593.38481: variable 'ansible_shell_type' from source: unknown 30575 1726867593.38804: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.38807: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867593.38809: variable 'ansible_pipelining' from source: unknown 30575 1726867593.38814: variable 'ansible_timeout' from source: unknown 30575 1726867593.38818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.38909: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867593.38917: variable 'omit' from source: magic vars 30575 1726867593.38920: starting attempt loop 30575 1726867593.38922: running the handler 30575 1726867593.39539: handler run complete 30575 1726867593.39551: attempt loop complete, returning result 30575 1726867593.39566: variable 'item' from source: unknown 30575 1726867593.39626: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 30575 1726867593.40110: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.40117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.40128: variable 'omit' from source: magic vars 30575 1726867593.40302: variable 'ansible_distribution_major_version' from source: facts 30575 1726867593.40555: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867593.40558: variable 'omit' from source: magic vars 30575 1726867593.40575: variable 'omit' from source: magic vars 30575 1726867593.40614: variable 'item' from source: unknown 30575 1726867593.40684: variable 'item' from source: unknown 30575 1726867593.41120: variable 'omit' from source: magic vars 30575 1726867593.41123: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867593.41125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.41228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.41231: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867593.41233: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.41235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.41338: Set connection var ansible_pipelining to False 30575 1726867593.41341: Set connection var ansible_shell_type to sh 30575 1726867593.41347: Set connection var ansible_shell_executable to /bin/sh 30575 1726867593.41353: Set connection var ansible_timeout to 10 30575 1726867593.41445: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867593.41448: Set connection var ansible_connection to ssh 30575 1726867593.41741: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.41744: variable 'ansible_connection' from source: unknown 30575 1726867593.41746: variable 'ansible_module_compression' from source: unknown 30575 1726867593.41748: variable 'ansible_shell_type' from source: unknown 30575 1726867593.41750: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.41752: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.41754: variable 'ansible_pipelining' from source: unknown 30575 1726867593.41757: variable 'ansible_timeout' from source: unknown 30575 1726867593.41759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.41805: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867593.41812: variable 'omit' from source: magic vars 30575 1726867593.41815: starting attempt loop 30575 1726867593.41817: running the handler 30575 1726867593.41849: variable 'lsr_fail_debug' from source: play vars 30575 1726867593.41901: variable 'lsr_fail_debug' from source: play vars 30575 1726867593.41917: handler run complete 30575 1726867593.41932: attempt loop complete, returning result 30575 1726867593.41956: variable 'item' from source: unknown 30575 1726867593.42209: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30575 1726867593.42502: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.42505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.42508: variable 'omit' from source: magic vars 30575 1726867593.42663: variable 'ansible_distribution_major_version' from source: facts 30575 1726867593.42666: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867593.42669: variable 'omit' from source: magic vars 30575 1726867593.42716: variable 'omit' from source: magic vars 30575 1726867593.42719: variable 'item' from source: unknown 30575 1726867593.42947: variable 'item' from source: unknown 30575 1726867593.42958: variable 'omit' from source: magic vars 30575 1726867593.42981: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867593.42990: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.42993: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.43002: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867593.43004: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.43055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.43303: Set connection var ansible_pipelining to False 30575 1726867593.43307: Set connection var ansible_shell_type to sh 30575 1726867593.43316: Set connection var ansible_shell_executable to /bin/sh 30575 1726867593.43319: Set connection var ansible_timeout to 10 30575 1726867593.43321: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867593.43331: Set connection var ansible_connection to ssh 30575 1726867593.43367: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.43370: variable 'ansible_connection' from source: unknown 30575 1726867593.43372: variable 'ansible_module_compression' from source: unknown 30575 1726867593.43379: variable 'ansible_shell_type' from source: unknown 30575 1726867593.43381: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.43383: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.43385: variable 'ansible_pipelining' from source: unknown 30575 1726867593.43387: variable 'ansible_timeout' from source: unknown 30575 1726867593.43388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.43693: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867593.43701: variable 'omit' from source: magic vars 30575 1726867593.43703: starting attempt loop 30575 1726867593.43706: running the handler 30575 1726867593.43914: variable 'lsr_cleanup' from source: include params 30575 1726867593.44106: variable 'lsr_cleanup' from source: include params 30575 1726867593.44130: handler run complete 30575 1726867593.44295: attempt loop complete, returning result 30575 1726867593.44299: variable 'item' from source: unknown 30575 1726867593.44389: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30575 1726867593.44570: dumping result to json 30575 1726867593.44573: done dumping result, returning 30575 1726867593.44575: done running TaskExecutor() for managed_node3/TASK: Show item [0affcac9-a3a5-e081-a588-000000000a4a] 30575 1726867593.44580: sending task result for task 0affcac9-a3a5-e081-a588-000000000a4a 30575 1726867593.44884: done sending task result for task 0affcac9-a3a5-e081-a588-000000000a4a 30575 1726867593.44999: no more pending results, returning what we have 30575 1726867593.45003: results queue empty 30575 1726867593.45004: checking for any_errors_fatal 30575 1726867593.45013: done checking for any_errors_fatal 30575 1726867593.45014: checking for max_fail_percentage 30575 1726867593.45015: done checking for max_fail_percentage 30575 1726867593.45016: checking to see if all hosts have failed and the running result is not ok 30575 1726867593.45017: done checking to see if all hosts have failed 30575 1726867593.45018: getting the remaining hosts for this loop 30575 1726867593.45020: done getting the remaining hosts for this loop 30575 1726867593.45024: getting the next task for host 
managed_node3 30575 1726867593.45033: done getting next task for host managed_node3 30575 1726867593.45036: ^ task is: TASK: Include the task 'show_interfaces.yml' 30575 1726867593.45040: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867593.45043: getting variables 30575 1726867593.45045: in VariableManager get_vars() 30575 1726867593.45082: Calling all_inventory to load vars for managed_node3 30575 1726867593.45085: Calling groups_inventory to load vars for managed_node3 30575 1726867593.45088: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867593.45099: Calling all_plugins_play to load vars for managed_node3 30575 1726867593.45101: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867593.45104: Calling groups_plugins_play to load vars for managed_node3 30575 1726867593.45734: WORKER PROCESS EXITING 30575 1726867593.48257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867593.52209: done with get_vars() 30575 1726867593.52235: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 17:26:33 -0400 (0:00:00.302) 0:00:28.902 ****** 30575 1726867593.52446: entering _queue_task() for managed_node3/include_tasks 30575 
1726867593.53066: worker is 1 (out of 1 available) 30575 1726867593.53083: exiting _queue_task() for managed_node3/include_tasks 30575 1726867593.53097: done queuing things up, now waiting for results queue to drain 30575 1726867593.53099: waiting for pending results... 30575 1726867593.53904: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 30575 1726867593.54040: in run() - task 0affcac9-a3a5-e081-a588-000000000a4b 30575 1726867593.54062: variable 'ansible_search_path' from source: unknown 30575 1726867593.54112: variable 'ansible_search_path' from source: unknown 30575 1726867593.54241: calling self._execute() 30575 1726867593.54407: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.54441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.54789: variable 'omit' from source: magic vars 30575 1726867593.55833: variable 'ansible_distribution_major_version' from source: facts 30575 1726867593.55853: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867593.55865: _execute() done 30575 1726867593.55874: dumping result to json 30575 1726867593.56122: done dumping result, returning 30575 1726867593.56126: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affcac9-a3a5-e081-a588-000000000a4b] 30575 1726867593.56128: sending task result for task 0affcac9-a3a5-e081-a588-000000000a4b 30575 1726867593.56205: done sending task result for task 0affcac9-a3a5-e081-a588-000000000a4b 30575 1726867593.56208: WORKER PROCESS EXITING 30575 1726867593.56246: no more pending results, returning what we have 30575 1726867593.56252: in VariableManager get_vars() 30575 1726867593.56300: Calling all_inventory to load vars for managed_node3 30575 1726867593.56303: Calling groups_inventory to load vars for managed_node3 30575 1726867593.56307: Calling all_plugins_inventory to load vars for managed_node3 
30575 1726867593.56324: Calling all_plugins_play to load vars for managed_node3 30575 1726867593.56328: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867593.56332: Calling groups_plugins_play to load vars for managed_node3 30575 1726867593.59605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867593.62194: done with get_vars() 30575 1726867593.62216: variable 'ansible_search_path' from source: unknown 30575 1726867593.62218: variable 'ansible_search_path' from source: unknown 30575 1726867593.62259: we have included files to process 30575 1726867593.62260: generating all_blocks data 30575 1726867593.62262: done generating all_blocks data 30575 1726867593.62268: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30575 1726867593.62269: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30575 1726867593.62271: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30575 1726867593.62403: in VariableManager get_vars() 30575 1726867593.62426: done with get_vars() 30575 1726867593.62550: done processing included file 30575 1726867593.62552: iterating over new_blocks loaded from include file 30575 1726867593.62554: in VariableManager get_vars() 30575 1726867593.62566: done with get_vars() 30575 1726867593.62568: filtering new block on tags 30575 1726867593.62606: done filtering new block on tags 30575 1726867593.62609: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 30575 1726867593.62614: extending task lists for all hosts with included blocks 30575 1726867593.63343: 
done extending task lists 30575 1726867593.63344: done processing included files 30575 1726867593.63345: results queue empty 30575 1726867593.63346: checking for any_errors_fatal 30575 1726867593.63352: done checking for any_errors_fatal 30575 1726867593.63353: checking for max_fail_percentage 30575 1726867593.63354: done checking for max_fail_percentage 30575 1726867593.63354: checking to see if all hosts have failed and the running result is not ok 30575 1726867593.63355: done checking to see if all hosts have failed 30575 1726867593.63356: getting the remaining hosts for this loop 30575 1726867593.63357: done getting the remaining hosts for this loop 30575 1726867593.63360: getting the next task for host managed_node3 30575 1726867593.63364: done getting next task for host managed_node3 30575 1726867593.63366: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30575 1726867593.63369: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867593.63372: getting variables 30575 1726867593.63373: in VariableManager get_vars() 30575 1726867593.63487: Calling all_inventory to load vars for managed_node3 30575 1726867593.63489: Calling groups_inventory to load vars for managed_node3 30575 1726867593.63492: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867593.63501: Calling all_plugins_play to load vars for managed_node3 30575 1726867593.63504: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867593.63507: Calling groups_plugins_play to load vars for managed_node3 30575 1726867593.65311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867593.69566: done with get_vars() 30575 1726867593.69680: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 17:26:33 -0400 (0:00:00.173) 0:00:29.075 ****** 30575 1726867593.69767: entering _queue_task() for managed_node3/include_tasks 30575 1726867593.70629: worker is 1 (out of 1 available) 30575 1726867593.70642: exiting _queue_task() for managed_node3/include_tasks 30575 1726867593.70655: done queuing things up, now waiting for results queue to drain 30575 1726867593.70656: waiting for pending results... 
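The include chain traced above runs show_interfaces.yml (itself loaded earlier in this run), which at line 3 includes get_current_interfaces.yml. A minimal sketch of that structure; only the include itself is attested by the logged task path, and any surrounding tasks in the file are assumed:

```yaml
# tasks/show_interfaces.yml (sketch): only the include at line 3 is
# attested by the logged task path; other file contents are assumed.
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
```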
30575 1726867593.71482: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 30575 1726867593.71817: in run() - task 0affcac9-a3a5-e081-a588-000000000a72 30575 1726867593.71832: variable 'ansible_search_path' from source: unknown 30575 1726867593.71836: variable 'ansible_search_path' from source: unknown 30575 1726867593.71991: calling self._execute() 30575 1726867593.72199: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.72204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.72217: variable 'omit' from source: magic vars 30575 1726867593.72980: variable 'ansible_distribution_major_version' from source: facts 30575 1726867593.72993: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867593.73004: _execute() done 30575 1726867593.73007: dumping result to json 30575 1726867593.73009: done dumping result, returning 30575 1726867593.73013: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affcac9-a3a5-e081-a588-000000000a72] 30575 1726867593.73019: sending task result for task 0affcac9-a3a5-e081-a588-000000000a72 30575 1726867593.73337: no more pending results, returning what we have 30575 1726867593.73343: in VariableManager get_vars() 30575 1726867593.73386: Calling all_inventory to load vars for managed_node3 30575 1726867593.73388: Calling groups_inventory to load vars for managed_node3 30575 1726867593.73392: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867593.73405: Calling all_plugins_play to load vars for managed_node3 30575 1726867593.73408: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867593.73410: Calling groups_plugins_play to load vars for managed_node3 30575 1726867593.74033: done sending task result for task 0affcac9-a3a5-e081-a588-000000000a72 30575 1726867593.74038: WORKER PROCESS EXITING 30575 
1726867593.76427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867593.79249: done with get_vars() 30575 1726867593.79269: variable 'ansible_search_path' from source: unknown 30575 1726867593.79270: variable 'ansible_search_path' from source: unknown 30575 1726867593.79309: we have included files to process 30575 1726867593.79310: generating all_blocks data 30575 1726867593.79312: done generating all_blocks data 30575 1726867593.79313: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867593.79314: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867593.79316: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867593.79744: done processing included file 30575 1726867593.79747: iterating over new_blocks loaded from include file 30575 1726867593.79749: in VariableManager get_vars() 30575 1726867593.79768: done with get_vars() 30575 1726867593.79770: filtering new block on tags 30575 1726867593.79849: done filtering new block on tags 30575 1726867593.79852: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 30575 1726867593.79857: extending task lists for all hosts with included blocks 30575 1726867593.80218: done extending task lists 30575 1726867593.80220: done processing included files 30575 1726867593.80221: results queue empty 30575 1726867593.80221: checking for any_errors_fatal 30575 1726867593.80223: done checking for any_errors_fatal 30575 1726867593.80224: checking for max_fail_percentage 30575 1726867593.80225: done 
checking for max_fail_percentage 30575 1726867593.80226: checking to see if all hosts have failed and the running result is not ok 30575 1726867593.80227: done checking to see if all hosts have failed 30575 1726867593.80227: getting the remaining hosts for this loop 30575 1726867593.80229: done getting the remaining hosts for this loop 30575 1726867593.80231: getting the next task for host managed_node3 30575 1726867593.80235: done getting next task for host managed_node3 30575 1726867593.80237: ^ task is: TASK: Gather current interface info 30575 1726867593.80241: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867593.80242: getting variables 30575 1726867593.80243: in VariableManager get_vars() 30575 1726867593.80252: Calling all_inventory to load vars for managed_node3 30575 1726867593.80254: Calling groups_inventory to load vars for managed_node3 30575 1726867593.80256: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867593.80261: Calling all_plugins_play to load vars for managed_node3 30575 1726867593.80263: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867593.80265: Calling groups_plugins_play to load vars for managed_node3 30575 1726867593.82846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867593.84645: done with get_vars() 30575 1726867593.84672: done getting variables 30575 1726867593.84717: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 17:26:33 -0400 (0:00:00.149) 0:00:29.225 ****** 30575 1726867593.84747: entering _queue_task() for managed_node3/command 30575 1726867593.85223: worker is 1 (out of 1 available) 30575 1726867593.85236: exiting _queue_task() for managed_node3/command 30575 1726867593.85247: done queuing things up, now waiting for results queue to drain 30575 1726867593.85249: waiting for pending results... 
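The "Gather current interface info" task queued here can be reconstructed from the module_args this run logs for it (`chdir: /sys/class/net`, raw params `ls -1`). A hedged sketch; the `register` variable name is an assumption, not shown in the log:

```yaml
# get_current_interfaces.yml:3 (reconstructed from the logged module_args;
# the register name below is hypothetical)
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net
  register: _current_interfaces
```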
30575 1726867593.85499: running TaskExecutor() for managed_node3/TASK: Gather current interface info 30575 1726867593.85575: in run() - task 0affcac9-a3a5-e081-a588-000000000aad 30575 1726867593.85593: variable 'ansible_search_path' from source: unknown 30575 1726867593.85598: variable 'ansible_search_path' from source: unknown 30575 1726867593.85633: calling self._execute() 30575 1726867593.85731: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.85786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.85790: variable 'omit' from source: magic vars 30575 1726867593.86185: variable 'ansible_distribution_major_version' from source: facts 30575 1726867593.86189: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867593.86191: variable 'omit' from source: magic vars 30575 1726867593.86396: variable 'omit' from source: magic vars 30575 1726867593.86401: variable 'omit' from source: magic vars 30575 1726867593.86403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867593.86406: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867593.86408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867593.86482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.86485: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867593.86488: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867593.86490: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.86492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867593.86634: Set connection var ansible_pipelining to False 30575 1726867593.86644: Set connection var ansible_shell_type to sh 30575 1726867593.86653: Set connection var ansible_shell_executable to /bin/sh 30575 1726867593.86664: Set connection var ansible_timeout to 10 30575 1726867593.86673: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867593.86687: Set connection var ansible_connection to ssh 30575 1726867593.86714: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.86742: variable 'ansible_connection' from source: unknown 30575 1726867593.86746: variable 'ansible_module_compression' from source: unknown 30575 1726867593.86834: variable 'ansible_shell_type' from source: unknown 30575 1726867593.86837: variable 'ansible_shell_executable' from source: unknown 30575 1726867593.86839: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867593.86841: variable 'ansible_pipelining' from source: unknown 30575 1726867593.86850: variable 'ansible_timeout' from source: unknown 30575 1726867593.86853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867593.86945: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867593.86972: variable 'omit' from source: magic vars 30575 1726867593.86986: starting attempt loop 30575 1726867593.86993: running the handler 30575 1726867593.87014: _low_level_execute_command(): starting 30575 1726867593.87027: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867593.87835: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867593.87847: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30575 1726867593.87860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867593.87938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867593.88094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867593.88121: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867593.88157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867593.88247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867593.90476: stdout chunk (state=3): >>>/root <<< 30575 1726867593.90564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867593.90569: stdout chunk (state=3): >>><<< 30575 1726867593.90574: stderr chunk (state=3): >>><<< 30575 1726867593.90579: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867593.90584: _low_level_execute_command(): starting 30575 1726867593.90688: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867593.9054284-31924-174133984133708 `" && echo ansible-tmp-1726867593.9054284-31924-174133984133708="` echo /root/.ansible/tmp/ansible-tmp-1726867593.9054284-31924-174133984133708 `" ) && sleep 0' 30575 1726867593.92196: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867593.92318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867593.92376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867593.92432: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867593.92730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867593.94623: stdout chunk (state=3): >>>ansible-tmp-1726867593.9054284-31924-174133984133708=/root/.ansible/tmp/ansible-tmp-1726867593.9054284-31924-174133984133708 <<< 30575 1726867593.94982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867593.94985: stdout chunk (state=3): >>><<< 30575 1726867593.94987: stderr chunk (state=3): >>><<< 30575 1726867593.94990: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867593.9054284-31924-174133984133708=/root/.ansible/tmp/ansible-tmp-1726867593.9054284-31924-174133984133708 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867593.94992: variable 'ansible_module_compression' from source: unknown 30575 1726867593.94994: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867593.95001: variable 'ansible_facts' from source: unknown 30575 1726867593.95049: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867593.9054284-31924-174133984133708/AnsiballZ_command.py 30575 1726867593.95293: Sending initial data 30575 1726867593.95296: Sent initial data (156 bytes) 30575 1726867593.95872: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867593.95893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867593.95964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867593.96003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867593.96016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867593.96076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867593.96196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867593.98240: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867593.98334: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpeqgsz0x9 /root/.ansible/tmp/ansible-tmp-1726867593.9054284-31924-174133984133708/AnsiballZ_command.py <<< 30575 1726867593.98337: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867593.9054284-31924-174133984133708/AnsiballZ_command.py" <<< 30575 1726867593.98390: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpeqgsz0x9" to remote "/root/.ansible/tmp/ansible-tmp-1726867593.9054284-31924-174133984133708/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867593.9054284-31924-174133984133708/AnsiballZ_command.py" <<< 30575 1726867593.99796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867593.99900: stderr chunk (state=3): >>><<< 30575 1726867593.99911: stdout chunk (state=3): >>><<< 30575 1726867593.99950: done transferring module to remote 30575 1726867593.99953: _low_level_execute_command(): starting 30575 1726867593.99956: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867593.9054284-31924-174133984133708/ /root/.ansible/tmp/ansible-tmp-1726867593.9054284-31924-174133984133708/AnsiballZ_command.py && sleep 0' 30575 1726867594.00664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867594.00669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867594.00675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867594.00700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867594.00718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 
30575 1726867594.00721: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867594.00775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867594.00788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867594.01198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867594.02996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867594.03202: stderr chunk (state=3): >>><<< 30575 1726867594.03207: stdout chunk (state=3): >>><<< 30575 1726867594.03211: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867594.03223: _low_level_execute_command(): starting 30575 1726867594.03227: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867593.9054284-31924-174133984133708/AnsiballZ_command.py && sleep 0' 30575 1726867594.04236: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867594.04430: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867594.04450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867594.04517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 30575 1726867594.04533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867594.04657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867594.20120: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:26:34.195341", "end": "2024-09-20 17:26:34.198887", "delta": "0:00:00.003546", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867594.21756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867594.21760: stderr chunk (state=3): >>><<< 30575 1726867594.21763: stdout chunk (state=3): >>><<< 30575 1726867594.21774: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:26:34.195341", "end": "2024-09-20 17:26:34.198887", "delta": "0:00:00.003546", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867594.21862: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867593.9054284-31924-174133984133708/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867594.21867: _low_level_execute_command(): starting 30575 1726867594.21869: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867593.9054284-31924-174133984133708/ > /dev/null 2>&1 && sleep 0' 30575 1726867594.22593: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867594.22597: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867594.22599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867594.22602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867594.22639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867594.22681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867594.22746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867594.24558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867594.24584: stderr chunk (state=3): >>><<< 30575 1726867594.24588: stdout chunk (state=3): >>><<< 30575 1726867594.24614: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867594.24618: handler run complete 30575 1726867594.24639: Evaluated conditional (False): False 30575 1726867594.24647: attempt loop complete, returning result 30575 1726867594.24651: _execute() done 30575 1726867594.24654: dumping result to json 30575 1726867594.24659: done dumping result, returning 30575 1726867594.24667: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affcac9-a3a5-e081-a588-000000000aad] 30575 1726867594.24672: sending task result for task 0affcac9-a3a5-e081-a588-000000000aad 30575 1726867594.24766: done sending task result for task 0affcac9-a3a5-e081-a588-000000000aad 30575 1726867594.24769: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003546", "end": "2024-09-20 17:26:34.198887", "rc": 0, "start": "2024-09-20 17:26:34.195341" } STDOUT: bonding_masters eth0 lo 30575 1726867594.24844: no more pending results, returning what we have 30575 1726867594.24848: results queue empty 30575 1726867594.24848: checking for any_errors_fatal 30575 1726867594.24850: done checking for any_errors_fatal 30575 1726867594.24851: checking 
for max_fail_percentage 30575 1726867594.24852: done checking for max_fail_percentage 30575 1726867594.24853: checking to see if all hosts have failed and the running result is not ok 30575 1726867594.24854: done checking to see if all hosts have failed 30575 1726867594.24855: getting the remaining hosts for this loop 30575 1726867594.24856: done getting the remaining hosts for this loop 30575 1726867594.24860: getting the next task for host managed_node3 30575 1726867594.24870: done getting next task for host managed_node3 30575 1726867594.24872: ^ task is: TASK: Set current_interfaces 30575 1726867594.24886: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867594.24892: getting variables 30575 1726867594.24893: in VariableManager get_vars() 30575 1726867594.24926: Calling all_inventory to load vars for managed_node3 30575 1726867594.24929: Calling groups_inventory to load vars for managed_node3 30575 1726867594.24933: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867594.24943: Calling all_plugins_play to load vars for managed_node3 30575 1726867594.24945: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867594.24948: Calling groups_plugins_play to load vars for managed_node3 30575 1726867594.26004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867594.32056: done with get_vars() 30575 1726867594.32075: done getting variables 30575 1726867594.32120: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 17:26:34 -0400 (0:00:00.473) 0:00:29.698 ****** 30575 1726867594.32141: entering _queue_task() for managed_node3/set_fact 30575 1726867594.32416: worker is 1 (out of 1 available) 30575 1726867594.32432: exiting _queue_task() for managed_node3/set_fact 30575 1726867594.32446: done queuing things up, now waiting for results queue to drain 30575 1726867594.32448: waiting for pending results... 
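The `Gather current interface info` result above comes from `ansible.legacy.command` running `argv: ["ls", "-1"]` with `chdir: /sys/class/net`. A minimal Python sketch of the same enumeration logic — run against a throwaway directory rather than the real `/sys/class/net`, so it works anywhere; the entry names are taken from the log's stdout and the helper name `list_entries` is invented for illustration:

```python
import os
import subprocess
import tempfile

def list_entries(path):
    # Mirrors the module call seen in the log: ["ls", "-1"] with chdir=path
    result = subprocess.run(
        ["ls", "-1"], cwd=path, capture_output=True, text=True, check=True
    )
    # strip_empty_ends=true in the module args drops trailing blank output
    return [line for line in result.stdout.splitlines() if line]

# Stand-in for /sys/class/net, populated with the entries from the log
with tempfile.TemporaryDirectory() as tmp:
    for name in ("bonding_masters", "eth0", "lo"):
        os.mkdir(os.path.join(tmp, name))
    print(list_entries(tmp))
```

Since `ls` sorts its output, the printed list matches the order in the task's stdout: `bonding_masters`, `eth0`, `lo`.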
30575 1726867594.32638: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 30575 1726867594.32728: in run() - task 0affcac9-a3a5-e081-a588-000000000aae 30575 1726867594.32736: variable 'ansible_search_path' from source: unknown 30575 1726867594.32740: variable 'ansible_search_path' from source: unknown 30575 1726867594.32768: calling self._execute() 30575 1726867594.32843: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867594.32849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867594.32860: variable 'omit' from source: magic vars 30575 1726867594.33138: variable 'ansible_distribution_major_version' from source: facts 30575 1726867594.33148: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867594.33154: variable 'omit' from source: magic vars 30575 1726867594.33189: variable 'omit' from source: magic vars 30575 1726867594.33264: variable '_current_interfaces' from source: set_fact 30575 1726867594.33314: variable 'omit' from source: magic vars 30575 1726867594.33347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867594.33374: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867594.33392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867594.33405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867594.33415: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867594.33442: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867594.33445: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867594.33448: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867594.33682: Set connection var ansible_pipelining to False 30575 1726867594.33685: Set connection var ansible_shell_type to sh 30575 1726867594.33688: Set connection var ansible_shell_executable to /bin/sh 30575 1726867594.33690: Set connection var ansible_timeout to 10 30575 1726867594.33692: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867594.33695: Set connection var ansible_connection to ssh 30575 1726867594.33698: variable 'ansible_shell_executable' from source: unknown 30575 1726867594.33701: variable 'ansible_connection' from source: unknown 30575 1726867594.33703: variable 'ansible_module_compression' from source: unknown 30575 1726867594.33705: variable 'ansible_shell_type' from source: unknown 30575 1726867594.33708: variable 'ansible_shell_executable' from source: unknown 30575 1726867594.33710: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867594.33712: variable 'ansible_pipelining' from source: unknown 30575 1726867594.33714: variable 'ansible_timeout' from source: unknown 30575 1726867594.33716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867594.33733: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867594.33741: variable 'omit' from source: magic vars 30575 1726867594.33746: starting attempt loop 30575 1726867594.33749: running the handler 30575 1726867594.33761: handler run complete 30575 1726867594.33770: attempt loop complete, returning result 30575 1726867594.33773: _execute() done 30575 1726867594.33776: dumping result to json 30575 1726867594.33780: done dumping result, returning 30575 
1726867594.33817: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affcac9-a3a5-e081-a588-000000000aae] 30575 1726867594.33822: sending task result for task 0affcac9-a3a5-e081-a588-000000000aae ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 30575 1726867594.34100: no more pending results, returning what we have 30575 1726867594.34105: results queue empty 30575 1726867594.34106: checking for any_errors_fatal 30575 1726867594.34115: done checking for any_errors_fatal 30575 1726867594.34115: checking for max_fail_percentage 30575 1726867594.34117: done checking for max_fail_percentage 30575 1726867594.34118: checking to see if all hosts have failed and the running result is not ok 30575 1726867594.34119: done checking to see if all hosts have failed 30575 1726867594.34120: getting the remaining hosts for this loop 30575 1726867594.34121: done getting the remaining hosts for this loop 30575 1726867594.34131: getting the next task for host managed_node3 30575 1726867594.34142: done getting next task for host managed_node3 30575 1726867594.34144: ^ task is: TASK: Show current_interfaces 30575 1726867594.34151: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867594.34156: getting variables 30575 1726867594.34157: in VariableManager get_vars() 30575 1726867594.34199: Calling all_inventory to load vars for managed_node3 30575 1726867594.34202: Calling groups_inventory to load vars for managed_node3 30575 1726867594.34206: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867594.34217: Calling all_plugins_play to load vars for managed_node3 30575 1726867594.34220: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867594.34226: Calling groups_plugins_play to load vars for managed_node3 30575 1726867594.34793: done sending task result for task 0affcac9-a3a5-e081-a588-000000000aae 30575 1726867594.34796: WORKER PROCESS EXITING 30575 1726867594.36672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867594.37666: done with get_vars() 30575 1726867594.37685: done getting variables 30575 1726867594.37728: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 17:26:34 -0400 (0:00:00.056) 0:00:29.755 ****** 30575 1726867594.37750: entering _queue_task() for managed_node3/debug 30575 1726867594.38511: worker is 1 (out of 1 available) 30575 1726867594.38519: exiting _queue_task() for managed_node3/debug 30575 1726867594.38531: done queuing things up, now waiting for results queue to drain 30575 1726867594.38532: waiting for pending 
results... 30575 1726867594.38631: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 30575 1726867594.38635: in run() - task 0affcac9-a3a5-e081-a588-000000000a73 30575 1726867594.38638: variable 'ansible_search_path' from source: unknown 30575 1726867594.38641: variable 'ansible_search_path' from source: unknown 30575 1726867594.38730: calling self._execute() 30575 1726867594.38886: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867594.38889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867594.38892: variable 'omit' from source: magic vars 30575 1726867594.39605: variable 'ansible_distribution_major_version' from source: facts 30575 1726867594.39864: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867594.39867: variable 'omit' from source: magic vars 30575 1726867594.39870: variable 'omit' from source: magic vars 30575 1726867594.40030: variable 'current_interfaces' from source: set_fact 30575 1726867594.40065: variable 'omit' from source: magic vars 30575 1726867594.40116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867594.40236: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867594.40268: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867594.40303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867594.40370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867594.40465: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867594.40469: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867594.40476: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867594.40783: Set connection var ansible_pipelining to False 30575 1726867594.40788: Set connection var ansible_shell_type to sh 30575 1726867594.40790: Set connection var ansible_shell_executable to /bin/sh 30575 1726867594.40793: Set connection var ansible_timeout to 10 30575 1726867594.40795: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867594.40797: Set connection var ansible_connection to ssh 30575 1726867594.40823: variable 'ansible_shell_executable' from source: unknown 30575 1726867594.40962: variable 'ansible_connection' from source: unknown 30575 1726867594.40965: variable 'ansible_module_compression' from source: unknown 30575 1726867594.40967: variable 'ansible_shell_type' from source: unknown 30575 1726867594.40969: variable 'ansible_shell_executable' from source: unknown 30575 1726867594.40970: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867594.40973: variable 'ansible_pipelining' from source: unknown 30575 1726867594.40975: variable 'ansible_timeout' from source: unknown 30575 1726867594.40978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867594.41191: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867594.41284: variable 'omit' from source: magic vars 30575 1726867594.41287: starting attempt loop 30575 1726867594.41289: running the handler 30575 1726867594.41484: handler run complete 30575 1726867594.41489: attempt loop complete, returning result 30575 1726867594.41492: _execute() done 30575 1726867594.41496: dumping result to json 30575 1726867594.41499: done dumping result, returning 30575 1726867594.41506: done 
running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affcac9-a3a5-e081-a588-000000000a73] 30575 1726867594.41508: sending task result for task 0affcac9-a3a5-e081-a588-000000000a73 30575 1726867594.41576: done sending task result for task 0affcac9-a3a5-e081-a588-000000000a73 ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 30575 1726867594.41658: no more pending results, returning what we have 30575 1726867594.41661: results queue empty 30575 1726867594.41662: checking for any_errors_fatal 30575 1726867594.41669: done checking for any_errors_fatal 30575 1726867594.41670: checking for max_fail_percentage 30575 1726867594.41671: done checking for max_fail_percentage 30575 1726867594.41672: checking to see if all hosts have failed and the running result is not ok 30575 1726867594.41673: done checking to see if all hosts have failed 30575 1726867594.41674: getting the remaining hosts for this loop 30575 1726867594.41675: done getting the remaining hosts for this loop 30575 1726867594.41680: getting the next task for host managed_node3 30575 1726867594.41687: done getting next task for host managed_node3 30575 1726867594.41691: ^ task is: TASK: Setup 30575 1726867594.41693: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867594.41697: getting variables 30575 1726867594.41699: in VariableManager get_vars() 30575 1726867594.41731: Calling all_inventory to load vars for managed_node3 30575 1726867594.41734: Calling groups_inventory to load vars for managed_node3 30575 1726867594.41737: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867594.41747: Calling all_plugins_play to load vars for managed_node3 30575 1726867594.41750: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867594.41752: Calling groups_plugins_play to load vars for managed_node3 30575 1726867594.42890: WORKER PROCESS EXITING 30575 1726867594.44649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867594.47788: done with get_vars() 30575 1726867594.47814: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 17:26:34 -0400 (0:00:00.103) 0:00:29.858 ****** 30575 1726867594.48111: entering _queue_task() for managed_node3/include_tasks 30575 1726867594.48876: worker is 1 (out of 1 available) 30575 1726867594.48891: exiting _queue_task() for managed_node3/include_tasks 30575 1726867594.48904: done queuing things up, now waiting for results queue to drain 30575 1726867594.48906: waiting for pending results... 
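The `Set current_interfaces` and `Show current_interfaces` results above amount to splitting the registered command's stdout into a list, then printing it. A small sketch of that data flow — the variable names `_current_interfaces` and `current_interfaces` come from the log, while the dict literal is a reduced stand-in for the full registered module result:

```python
# Reduced stand-in for the registered result of the "ls -1" task
_current_interfaces = {"rc": 0, "stdout": "bonding_masters\neth0\nlo"}

# set_fact step: splitting stdout into a per-line list of interface names
current_interfaces = _current_interfaces["stdout"].splitlines()

# debug step, matching the MSG line shown in the log
print(f"current_interfaces: {current_interfaces}")
```

The printed line matches the log's `MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']`.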
30575 1726867594.49432: running TaskExecutor() for managed_node3/TASK: Setup 30575 1726867594.49616: in run() - task 0affcac9-a3a5-e081-a588-000000000a4c 30575 1726867594.49711: variable 'ansible_search_path' from source: unknown 30575 1726867594.49739: variable 'ansible_search_path' from source: unknown 30575 1726867594.49952: variable 'lsr_setup' from source: include params 30575 1726867594.50286: variable 'lsr_setup' from source: include params 30575 1726867594.50484: variable 'omit' from source: magic vars 30575 1726867594.50715: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867594.50835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867594.50840: variable 'omit' from source: magic vars 30575 1726867594.51589: variable 'ansible_distribution_major_version' from source: facts 30575 1726867594.51592: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867594.51595: variable 'item' from source: unknown 30575 1726867594.51660: variable 'item' from source: unknown 30575 1726867594.51826: variable 'item' from source: unknown 30575 1726867594.51892: variable 'item' from source: unknown 30575 1726867594.52386: dumping result to json 30575 1726867594.52390: done dumping result, returning 30575 1726867594.52393: done running TaskExecutor() for managed_node3/TASK: Setup [0affcac9-a3a5-e081-a588-000000000a4c] 30575 1726867594.52396: sending task result for task 0affcac9-a3a5-e081-a588-000000000a4c 30575 1726867594.52556: done sending task result for task 0affcac9-a3a5-e081-a588-000000000a4c 30575 1726867594.52561: WORKER PROCESS EXITING 30575 1726867594.52587: no more pending results, returning what we have 30575 1726867594.52593: in VariableManager get_vars() 30575 1726867594.52632: Calling all_inventory to load vars for managed_node3 30575 1726867594.52635: Calling groups_inventory to load vars for managed_node3 30575 1726867594.52639: Calling all_plugins_inventory to 
load vars for managed_node3 30575 1726867594.52651: Calling all_plugins_play to load vars for managed_node3 30575 1726867594.52654: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867594.52657: Calling groups_plugins_play to load vars for managed_node3 30575 1726867594.54901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867594.57635: done with get_vars() 30575 1726867594.57651: variable 'ansible_search_path' from source: unknown 30575 1726867594.57652: variable 'ansible_search_path' from source: unknown 30575 1726867594.57681: we have included files to process 30575 1726867594.57682: generating all_blocks data 30575 1726867594.57684: done generating all_blocks data 30575 1726867594.57686: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30575 1726867594.57687: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30575 1726867594.57688: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30575 1726867594.57864: done processing included file 30575 1726867594.57865: iterating over new_blocks loaded from include file 30575 1726867594.57866: in VariableManager get_vars() 30575 1726867594.57876: done with get_vars() 30575 1726867594.57879: filtering new block on tags 30575 1726867594.57901: done filtering new block on tags 30575 1726867594.57902: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node3 => (item=tasks/create_bridge_profile.yml) 30575 1726867594.57906: extending task lists for all hosts with included blocks 30575 1726867594.58243: done 
extending task lists 30575 1726867594.58244: done processing included files 30575 1726867594.58244: results queue empty 30575 1726867594.58245: checking for any_errors_fatal 30575 1726867594.58247: done checking for any_errors_fatal 30575 1726867594.58248: checking for max_fail_percentage 30575 1726867594.58249: done checking for max_fail_percentage 30575 1726867594.58249: checking to see if all hosts have failed and the running result is not ok 30575 1726867594.58250: done checking to see if all hosts have failed 30575 1726867594.58250: getting the remaining hosts for this loop 30575 1726867594.58251: done getting the remaining hosts for this loop 30575 1726867594.58253: getting the next task for host managed_node3 30575 1726867594.58256: done getting next task for host managed_node3 30575 1726867594.58257: ^ task is: TASK: Include network role 30575 1726867594.58259: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867594.58261: getting variables 30575 1726867594.58261: in VariableManager get_vars() 30575 1726867594.58268: Calling all_inventory to load vars for managed_node3 30575 1726867594.58270: Calling groups_inventory to load vars for managed_node3 30575 1726867594.58271: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867594.58276: Calling all_plugins_play to load vars for managed_node3 30575 1726867594.58279: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867594.58282: Calling groups_plugins_play to load vars for managed_node3 30575 1726867594.58972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867594.59830: done with get_vars() 30575 1726867594.59845: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 17:26:34 -0400 (0:00:00.117) 0:00:29.976 ****** 30575 1726867594.59914: entering _queue_task() for managed_node3/include_role 30575 1726867594.60269: worker is 1 (out of 1 available) 30575 1726867594.60383: exiting _queue_task() for managed_node3/include_role 30575 1726867594.60397: done queuing things up, now waiting for results queue to drain 30575 1726867594.60398: waiting for pending results... 
30575 1726867594.60735: running TaskExecutor() for managed_node3/TASK: Include network role 30575 1726867594.60910: in run() - task 0affcac9-a3a5-e081-a588-000000000ad1 30575 1726867594.60921: variable 'ansible_search_path' from source: unknown 30575 1726867594.60924: variable 'ansible_search_path' from source: unknown 30575 1726867594.60984: calling self._execute() 30575 1726867594.61127: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867594.61234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867594.61239: variable 'omit' from source: magic vars 30575 1726867594.61579: variable 'ansible_distribution_major_version' from source: facts 30575 1726867594.61602: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867594.61620: _execute() done 30575 1726867594.61630: dumping result to json 30575 1726867594.61643: done dumping result, returning 30575 1726867594.61661: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcac9-a3a5-e081-a588-000000000ad1] 30575 1726867594.61674: sending task result for task 0affcac9-a3a5-e081-a588-000000000ad1 30575 1726867594.61866: no more pending results, returning what we have 30575 1726867594.61873: in VariableManager get_vars() 30575 1726867594.61921: Calling all_inventory to load vars for managed_node3 30575 1726867594.61924: Calling groups_inventory to load vars for managed_node3 30575 1726867594.61928: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867594.61942: Calling all_plugins_play to load vars for managed_node3 30575 1726867594.61947: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867594.61950: Calling groups_plugins_play to load vars for managed_node3 30575 1726867594.63451: done sending task result for task 0affcac9-a3a5-e081-a588-000000000ad1 30575 1726867594.63454: WORKER PROCESS EXITING 30575 1726867594.64464: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867594.68560: done with get_vars() 30575 1726867594.68583: variable 'ansible_search_path' from source: unknown 30575 1726867594.68585: variable 'ansible_search_path' from source: unknown 30575 1726867594.68866: variable 'omit' from source: magic vars 30575 1726867594.68908: variable 'omit' from source: magic vars 30575 1726867594.68927: variable 'omit' from source: magic vars 30575 1726867594.68931: we have included files to process 30575 1726867594.68932: generating all_blocks data 30575 1726867594.68933: done generating all_blocks data 30575 1726867594.68935: processing included file: fedora.linux_system_roles.network 30575 1726867594.68956: in VariableManager get_vars() 30575 1726867594.68970: done with get_vars() 30575 1726867594.69201: in VariableManager get_vars() 30575 1726867594.69218: done with get_vars() 30575 1726867594.69255: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30575 1726867594.69376: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30575 1726867594.69594: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30575 1726867594.70012: in VariableManager get_vars() 30575 1726867594.70031: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867594.72004: iterating over new_blocks loaded from include file 30575 1726867594.72006: in VariableManager get_vars() 30575 1726867594.72022: done with get_vars() 30575 1726867594.72027: filtering new block on tags 30575 1726867594.72322: done filtering new block on tags 30575 1726867594.72329: in VariableManager get_vars() 30575 1726867594.72343: done with get_vars() 30575 1726867594.72345: filtering new block on tags 30575 1726867594.72360: done 
filtering new block on tags 30575 1726867594.72361: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30575 1726867594.72367: extending task lists for all hosts with included blocks 30575 1726867594.72529: done extending task lists 30575 1726867594.72531: done processing included files 30575 1726867594.72532: results queue empty 30575 1726867594.72532: checking for any_errors_fatal 30575 1726867594.72535: done checking for any_errors_fatal 30575 1726867594.72536: checking for max_fail_percentage 30575 1726867594.72537: done checking for max_fail_percentage 30575 1726867594.72538: checking to see if all hosts have failed and the running result is not ok 30575 1726867594.72539: done checking to see if all hosts have failed 30575 1726867594.72539: getting the remaining hosts for this loop 30575 1726867594.72541: done getting the remaining hosts for this loop 30575 1726867594.72543: getting the next task for host managed_node3 30575 1726867594.72548: done getting next task for host managed_node3 30575 1726867594.72551: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867594.72554: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867594.72563: getting variables 30575 1726867594.72564: in VariableManager get_vars() 30575 1726867594.72579: Calling all_inventory to load vars for managed_node3 30575 1726867594.72581: Calling groups_inventory to load vars for managed_node3 30575 1726867594.72583: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867594.72589: Calling all_plugins_play to load vars for managed_node3 30575 1726867594.72592: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867594.72595: Calling groups_plugins_play to load vars for managed_node3 30575 1726867594.73748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867594.75364: done with get_vars() 30575 1726867594.75387: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:26:34 -0400 (0:00:00.155) 0:00:30.132 ****** 30575 1726867594.75463: entering _queue_task() for managed_node3/include_tasks 30575 1726867594.75833: worker is 1 (out of 1 available) 30575 1726867594.75847: exiting _queue_task() for managed_node3/include_tasks 30575 1726867594.75863: done queuing things up, now waiting for results queue to drain 30575 1726867594.75864: waiting for pending results... 
30575 1726867594.76168: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867594.76326: in run() - task 0affcac9-a3a5-e081-a588-000000000b33 30575 1726867594.76351: variable 'ansible_search_path' from source: unknown 30575 1726867594.76360: variable 'ansible_search_path' from source: unknown 30575 1726867594.76404: calling self._execute() 30575 1726867594.76506: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867594.76525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867594.76541: variable 'omit' from source: magic vars 30575 1726867594.76918: variable 'ansible_distribution_major_version' from source: facts 30575 1726867594.76938: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867594.76952: _execute() done 30575 1726867594.76965: dumping result to json 30575 1726867594.76974: done dumping result, returning 30575 1726867594.76988: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-e081-a588-000000000b33] 30575 1726867594.76999: sending task result for task 0affcac9-a3a5-e081-a588-000000000b33 30575 1726867594.77184: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b33 30575 1726867594.77187: WORKER PROCESS EXITING 30575 1726867594.77233: no more pending results, returning what we have 30575 1726867594.77239: in VariableManager get_vars() 30575 1726867594.77288: Calling all_inventory to load vars for managed_node3 30575 1726867594.77292: Calling groups_inventory to load vars for managed_node3 30575 1726867594.77295: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867594.77309: Calling all_plugins_play to load vars for managed_node3 30575 1726867594.77312: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867594.77315: Calling 
groups_plugins_play to load vars for managed_node3 30575 1726867594.79220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867594.81285: done with get_vars() 30575 1726867594.81307: variable 'ansible_search_path' from source: unknown 30575 1726867594.81308: variable 'ansible_search_path' from source: unknown 30575 1726867594.81348: we have included files to process 30575 1726867594.81350: generating all_blocks data 30575 1726867594.81352: done generating all_blocks data 30575 1726867594.81355: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867594.81356: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867594.81358: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867594.81932: done processing included file 30575 1726867594.81934: iterating over new_blocks loaded from include file 30575 1726867594.81935: in VariableManager get_vars() 30575 1726867594.81958: done with get_vars() 30575 1726867594.81960: filtering new block on tags 30575 1726867594.81991: done filtering new block on tags 30575 1726867594.81994: in VariableManager get_vars() 30575 1726867594.82015: done with get_vars() 30575 1726867594.82017: filtering new block on tags 30575 1726867594.82063: done filtering new block on tags 30575 1726867594.82066: in VariableManager get_vars() 30575 1726867594.82089: done with get_vars() 30575 1726867594.82091: filtering new block on tags 30575 1726867594.82137: done filtering new block on tags 30575 1726867594.82140: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30575 1726867594.82145: extending task lists for 
all hosts with included blocks 30575 1726867594.84320: done extending task lists 30575 1726867594.84322: done processing included files 30575 1726867594.84322: results queue empty 30575 1726867594.84326: checking for any_errors_fatal 30575 1726867594.84329: done checking for any_errors_fatal 30575 1726867594.84330: checking for max_fail_percentage 30575 1726867594.84331: done checking for max_fail_percentage 30575 1726867594.84331: checking to see if all hosts have failed and the running result is not ok 30575 1726867594.84332: done checking to see if all hosts have failed 30575 1726867594.84333: getting the remaining hosts for this loop 30575 1726867594.84334: done getting the remaining hosts for this loop 30575 1726867594.84337: getting the next task for host managed_node3 30575 1726867594.84342: done getting next task for host managed_node3 30575 1726867594.84345: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867594.84348: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867594.84359: getting variables 30575 1726867594.84360: in VariableManager get_vars() 30575 1726867594.84373: Calling all_inventory to load vars for managed_node3 30575 1726867594.84375: Calling groups_inventory to load vars for managed_node3 30575 1726867594.84396: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867594.84402: Calling all_plugins_play to load vars for managed_node3 30575 1726867594.84404: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867594.84407: Calling groups_plugins_play to load vars for managed_node3 30575 1726867594.86767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867594.90546: done with get_vars() 30575 1726867594.90575: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:26:34 -0400 (0:00:00.152) 0:00:30.284 ****** 30575 1726867594.90672: entering _queue_task() for managed_node3/setup 30575 1726867594.91866: worker is 1 (out of 1 available) 30575 1726867594.91882: exiting _queue_task() for managed_node3/setup 30575 1726867594.91898: done queuing things up, now waiting for results queue to drain 30575 1726867594.91899: waiting for pending results... 
30575 1726867594.92971: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867594.93984: in run() - task 0affcac9-a3a5-e081-a588-000000000b90 30575 1726867594.93995: variable 'ansible_search_path' from source: unknown 30575 1726867594.93999: variable 'ansible_search_path' from source: unknown 30575 1726867594.94002: calling self._execute() 30575 1726867594.94066: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867594.94104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867594.94108: variable 'omit' from source: magic vars 30575 1726867594.95283: variable 'ansible_distribution_major_version' from source: facts 30575 1726867594.95287: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867594.96046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867595.00628: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867595.00708: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867595.00756: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867595.00805: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867595.00841: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867595.00938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867595.01013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867595.01017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867595.01066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867595.01088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867595.01153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867595.01183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867595.01235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867595.01282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867595.01302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867595.01564: variable '__network_required_facts' from source: role 
'' defaults 30575 1726867595.01567: variable 'ansible_facts' from source: unknown 30575 1726867595.02957: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30575 1726867595.03027: when evaluation is False, skipping this task 30575 1726867595.03346: _execute() done 30575 1726867595.03350: dumping result to json 30575 1726867595.03352: done dumping result, returning 30575 1726867595.03355: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-e081-a588-000000000b90] 30575 1726867595.03357: sending task result for task 0affcac9-a3a5-e081-a588-000000000b90 30575 1726867595.03432: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b90 30575 1726867595.03437: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867595.03496: no more pending results, returning what we have 30575 1726867595.03501: results queue empty 30575 1726867595.03502: checking for any_errors_fatal 30575 1726867595.03503: done checking for any_errors_fatal 30575 1726867595.03504: checking for max_fail_percentage 30575 1726867595.03506: done checking for max_fail_percentage 30575 1726867595.03507: checking to see if all hosts have failed and the running result is not ok 30575 1726867595.03508: done checking to see if all hosts have failed 30575 1726867595.03509: getting the remaining hosts for this loop 30575 1726867595.03510: done getting the remaining hosts for this loop 30575 1726867595.03514: getting the next task for host managed_node3 30575 1726867595.03532: done getting next task for host managed_node3 30575 1726867595.03536: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867595.03543: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867595.03567: getting variables 30575 1726867595.03569: in VariableManager get_vars() 30575 1726867595.03611: Calling all_inventory to load vars for managed_node3 30575 1726867595.03614: Calling groups_inventory to load vars for managed_node3 30575 1726867595.03617: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867595.03631: Calling all_plugins_play to load vars for managed_node3 30575 1726867595.03634: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867595.03644: Calling groups_plugins_play to load vars for managed_node3 30575 1726867595.07756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867595.11534: done with get_vars() 30575 1726867595.11559: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:26:35 -0400 (0:00:00.212) 0:00:30.497 ****** 30575 1726867595.11972: entering _queue_task() for managed_node3/stat 30575 1726867595.13151: worker is 1 (out of 1 available) 30575 1726867595.13166: exiting _queue_task() for managed_node3/stat 30575 1726867595.13383: done queuing things up, now waiting for results queue to drain 30575 1726867595.13385: waiting for pending results... 
30575 1726867595.14038: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867595.14448: in run() - task 0affcac9-a3a5-e081-a588-000000000b92 30575 1726867595.14495: variable 'ansible_search_path' from source: unknown 30575 1726867595.14498: variable 'ansible_search_path' from source: unknown 30575 1726867595.14605: calling self._execute() 30575 1726867595.14926: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867595.14930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867595.15060: variable 'omit' from source: magic vars 30575 1726867595.16197: variable 'ansible_distribution_major_version' from source: facts 30575 1726867595.16210: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867595.16759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867595.17842: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867595.17934: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867595.18021: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867595.18058: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867595.18430: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867595.18576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867595.18604: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867595.18746: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867595.19128: variable '__network_is_ostree' from source: set_fact 30575 1726867595.19132: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867595.19134: when evaluation is False, skipping this task 30575 1726867595.19139: _execute() done 30575 1726867595.19147: dumping result to json 30575 1726867595.19150: done dumping result, returning 30575 1726867595.19182: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-e081-a588-000000000b92] 30575 1726867595.19185: sending task result for task 0affcac9-a3a5-e081-a588-000000000b92 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867595.19539: no more pending results, returning what we have 30575 1726867595.19544: results queue empty 30575 1726867595.19544: checking for any_errors_fatal 30575 1726867595.19550: done checking for any_errors_fatal 30575 1726867595.19551: checking for max_fail_percentage 30575 1726867595.19553: done checking for max_fail_percentage 30575 1726867595.19554: checking to see if all hosts have failed and the running result is not ok 30575 1726867595.19555: done checking to see if all hosts have failed 30575 1726867595.19555: getting the remaining hosts for this loop 30575 1726867595.19557: done getting the remaining hosts for this loop 30575 1726867595.19561: getting the next task for host managed_node3 30575 1726867595.19568: done getting next task for host managed_node3 30575 
1726867595.19572: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867595.19580: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867595.19603: getting variables 30575 1726867595.19605: in VariableManager get_vars() 30575 1726867595.19648: Calling all_inventory to load vars for managed_node3 30575 1726867595.19651: Calling groups_inventory to load vars for managed_node3 30575 1726867595.19653: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867595.19664: Calling all_plugins_play to load vars for managed_node3 30575 1726867595.19668: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867595.19672: Calling groups_plugins_play to load vars for managed_node3 30575 1726867595.20637: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b92 30575 1726867595.20642: WORKER PROCESS EXITING 30575 1726867595.23513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867595.28159: done with get_vars() 30575 1726867595.28246: done getting variables 30575 1726867595.28312: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:26:35 -0400 (0:00:00.164) 0:00:30.662 ****** 30575 1726867595.28475: entering _queue_task() for managed_node3/set_fact 30575 1726867595.29549: worker is 1 (out of 1 available) 30575 1726867595.29563: exiting _queue_task() for managed_node3/set_fact 30575 1726867595.29580: done queuing things up, now waiting for results queue to drain 30575 1726867595.29582: waiting for pending results... 
30575 1726867595.30231: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867595.30572: in run() - task 0affcac9-a3a5-e081-a588-000000000b93 30575 1726867595.30661: variable 'ansible_search_path' from source: unknown 30575 1726867595.30670: variable 'ansible_search_path' from source: unknown 30575 1726867595.30858: calling self._execute() 30575 1726867595.31005: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867595.31038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867595.31089: variable 'omit' from source: magic vars 30575 1726867595.31941: variable 'ansible_distribution_major_version' from source: facts 30575 1726867595.31995: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867595.32494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867595.33038: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867595.33093: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867595.33179: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867595.33290: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867595.33496: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867595.33527: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867595.33558: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867595.33722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867595.33861: variable '__network_is_ostree' from source: set_fact 30575 1726867595.33873: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867595.33886: when evaluation is False, skipping this task 30575 1726867595.33917: _execute() done 30575 1726867595.33926: dumping result to json 30575 1726867595.33936: done dumping result, returning 30575 1726867595.33958: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-e081-a588-000000000b93] 30575 1726867595.34029: sending task result for task 0affcac9-a3a5-e081-a588-000000000b93 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867595.34351: no more pending results, returning what we have 30575 1726867595.34356: results queue empty 30575 1726867595.34357: checking for any_errors_fatal 30575 1726867595.34365: done checking for any_errors_fatal 30575 1726867595.34366: checking for max_fail_percentage 30575 1726867595.34369: done checking for max_fail_percentage 30575 1726867595.34370: checking to see if all hosts have failed and the running result is not ok 30575 1726867595.34371: done checking to see if all hosts have failed 30575 1726867595.34372: getting the remaining hosts for this loop 30575 1726867595.34374: done getting the remaining hosts for this loop 30575 1726867595.34380: getting the next task for host managed_node3 30575 1726867595.34395: done getting next task for host managed_node3 30575 
1726867595.34402: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867595.34409: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867595.34440: getting variables 30575 1726867595.34442: in VariableManager get_vars() 30575 1726867595.34793: Calling all_inventory to load vars for managed_node3 30575 1726867595.34796: Calling groups_inventory to load vars for managed_node3 30575 1726867595.34799: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867595.34809: Calling all_plugins_play to load vars for managed_node3 30575 1726867595.34813: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867595.34816: Calling groups_plugins_play to load vars for managed_node3 30575 1726867595.35408: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b93 30575 1726867595.35411: WORKER PROCESS EXITING 30575 1726867595.38084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867595.42269: done with get_vars() 30575 1726867595.42359: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:26:35 -0400 (0:00:00.142) 0:00:30.804 ****** 30575 1726867595.42690: entering _queue_task() for managed_node3/service_facts 30575 1726867595.43581: worker is 1 (out of 1 available) 30575 1726867595.43595: exiting _queue_task() for managed_node3/service_facts 30575 1726867595.43720: done queuing things up, now waiting for results queue to drain 30575 1726867595.43722: waiting for pending results... 
30575 1726867595.44247: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867595.44568: in run() - task 0affcac9-a3a5-e081-a588-000000000b95 30575 1726867595.44598: variable 'ansible_search_path' from source: unknown 30575 1726867595.44607: variable 'ansible_search_path' from source: unknown 30575 1726867595.44694: calling self._execute() 30575 1726867595.44858: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867595.44922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867595.44941: variable 'omit' from source: magic vars 30575 1726867595.45804: variable 'ansible_distribution_major_version' from source: facts 30575 1726867595.46004: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867595.46007: variable 'omit' from source: magic vars 30575 1726867595.46183: variable 'omit' from source: magic vars 30575 1726867595.46186: variable 'omit' from source: magic vars 30575 1726867595.46244: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867595.46319: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867595.46554: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867595.46561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867595.46565: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867595.46567: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867595.46570: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867595.46572: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867595.46892: Set connection var ansible_pipelining to False 30575 1726867595.46903: Set connection var ansible_shell_type to sh 30575 1726867595.46915: Set connection var ansible_shell_executable to /bin/sh 30575 1726867595.47085: Set connection var ansible_timeout to 10 30575 1726867595.47089: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867595.47091: Set connection var ansible_connection to ssh 30575 1726867595.47093: variable 'ansible_shell_executable' from source: unknown 30575 1726867595.47098: variable 'ansible_connection' from source: unknown 30575 1726867595.47101: variable 'ansible_module_compression' from source: unknown 30575 1726867595.47103: variable 'ansible_shell_type' from source: unknown 30575 1726867595.47105: variable 'ansible_shell_executable' from source: unknown 30575 1726867595.47106: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867595.47108: variable 'ansible_pipelining' from source: unknown 30575 1726867595.47110: variable 'ansible_timeout' from source: unknown 30575 1726867595.47112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867595.47666: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867595.47672: variable 'omit' from source: magic vars 30575 1726867595.47675: starting attempt loop 30575 1726867595.47678: running the handler 30575 1726867595.47681: _low_level_execute_command(): starting 30575 1726867595.47690: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867595.49129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867595.49144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30575 1726867595.49170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867595.49204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867595.49294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867595.49349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867595.49367: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867595.49432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867595.49565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867595.51279: stdout chunk (state=3): >>>/root <<< 30575 1726867595.51553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867595.51557: stdout chunk (state=3): >>><<< 30575 1726867595.51560: stderr chunk (state=3): >>><<< 30575 1726867595.51789: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867595.51793: _low_level_execute_command(): starting 30575 1726867595.51797: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867595.5158641-32039-172220120957294 `" && echo ansible-tmp-1726867595.5158641-32039-172220120957294="` echo /root/.ansible/tmp/ansible-tmp-1726867595.5158641-32039-172220120957294 `" ) && sleep 0' 30575 1726867595.53599: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867595.53667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867595.54055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867595.54093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867595.54203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867595.56046: stdout chunk (state=3): >>>ansible-tmp-1726867595.5158641-32039-172220120957294=/root/.ansible/tmp/ansible-tmp-1726867595.5158641-32039-172220120957294 <<< 30575 1726867595.56160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867595.56299: stderr chunk (state=3): >>><<< 30575 1726867595.56303: stdout chunk (state=3): >>><<< 30575 1726867595.56406: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867595.5158641-32039-172220120957294=/root/.ansible/tmp/ansible-tmp-1726867595.5158641-32039-172220120957294 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867595.56585: variable 'ansible_module_compression' from source: unknown 30575 1726867595.56588: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30575 1726867595.56590: variable 'ansible_facts' from source: unknown 30575 1726867595.57349: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867595.5158641-32039-172220120957294/AnsiballZ_service_facts.py 30575 1726867595.57827: Sending initial data 30575 1726867595.57830: Sent initial data (162 bytes) 30575 1726867595.58896: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867595.58904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867595.59010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867595.59103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867595.59114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867595.59195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867595.59215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867595.60899: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867595.5158641-32039-172220120957294/AnsiballZ_service_facts.py" <<< 30575 1726867595.60903: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp4drw02ko /root/.ansible/tmp/ansible-tmp-1726867595.5158641-32039-172220120957294/AnsiballZ_service_facts.py <<< 30575 1726867595.60929: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp4drw02ko" to remote "/root/.ansible/tmp/ansible-tmp-1726867595.5158641-32039-172220120957294/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867595.5158641-32039-172220120957294/AnsiballZ_service_facts.py" <<< 30575 1726867595.62612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867595.62616: stdout chunk (state=3): >>><<< 30575 1726867595.62618: stderr chunk (state=3): >>><<< 30575 1726867595.62620: done transferring module to remote 30575 1726867595.62622: _low_level_execute_command(): starting 30575 1726867595.62627: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867595.5158641-32039-172220120957294/ /root/.ansible/tmp/ansible-tmp-1726867595.5158641-32039-172220120957294/AnsiballZ_service_facts.py && sleep 0' 30575 1726867595.63780: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867595.63791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867595.63804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867595.64034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867595.64037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867595.64040: 
stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867595.64042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867595.64044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867595.64046: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867595.64048: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867595.64050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867595.64052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867595.64054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867595.64055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867595.64057: stderr chunk (state=3): >>>debug2: match found <<< 30575 1726867595.64063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867595.64228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867595.64291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867595.66183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867595.66186: stdout chunk (state=3): >>><<< 30575 1726867595.66188: stderr chunk (state=3): >>><<< 30575 1726867595.66191: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867595.66193: _low_level_execute_command(): starting 30575 1726867595.66195: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867595.5158641-32039-172220120957294/AnsiballZ_service_facts.py && sleep 0' 30575 1726867595.67127: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867595.67376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867595.67393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867595.67415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867595.67481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867597.19308: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, 
"hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 30575 1726867597.19326: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 30575 1726867597.19431: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": 
"alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", 
"state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30575 1726867597.20969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867597.21016: stderr chunk (state=3): >>><<< 30575 1726867597.21019: stdout chunk (state=3): >>><<< 30575 1726867597.21069: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": 
"getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": 
"rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": 
"systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", 
"status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867597.21729: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867595.5158641-32039-172220120957294/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867597.21793: _low_level_execute_command(): starting 30575 1726867597.21796: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867595.5158641-32039-172220120957294/ > /dev/null 2>&1 && sleep 0' 30575 1726867597.22453: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867597.22473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867597.22496: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867597.22499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867597.22548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867597.22560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867597.22620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867597.24511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867597.24514: stdout chunk (state=3): >>><<< 30575 1726867597.24517: stderr chunk (state=3): >>><<< 30575 1726867597.24539: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867597.24543: handler run 
complete 30575 1726867597.24699: variable 'ansible_facts' from source: unknown 30575 1726867597.24811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867597.25113: variable 'ansible_facts' from source: unknown 30575 1726867597.25197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867597.25317: attempt loop complete, returning result 30575 1726867597.25320: _execute() done 30575 1726867597.25323: dumping result to json 30575 1726867597.25367: done dumping result, returning 30575 1726867597.25374: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-e081-a588-000000000b95] 30575 1726867597.25380: sending task result for task 0affcac9-a3a5-e081-a588-000000000b95 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867597.26087: no more pending results, returning what we have 30575 1726867597.26090: results queue empty 30575 1726867597.26091: checking for any_errors_fatal 30575 1726867597.26099: done checking for any_errors_fatal 30575 1726867597.26101: checking for max_fail_percentage 30575 1726867597.26102: done checking for max_fail_percentage 30575 1726867597.26103: checking to see if all hosts have failed and the running result is not ok 30575 1726867597.26104: done checking to see if all hosts have failed 30575 1726867597.26104: getting the remaining hosts for this loop 30575 1726867597.26105: done getting the remaining hosts for this loop 30575 1726867597.26108: getting the next task for host managed_node3 30575 1726867597.26115: done getting next task for host managed_node3 30575 1726867597.26118: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867597.26130: ^ state is: HOST STATE: 
block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867597.26138: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b95 30575 1726867597.26141: WORKER PROCESS EXITING 30575 1726867597.26149: getting variables 30575 1726867597.26150: in VariableManager get_vars() 30575 1726867597.26182: Calling all_inventory to load vars for managed_node3 30575 1726867597.26185: Calling groups_inventory to load vars for managed_node3 30575 1726867597.26187: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867597.26196: Calling all_plugins_play to load vars for managed_node3 30575 1726867597.26199: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867597.26202: Calling groups_plugins_play to load vars for managed_node3 30575 1726867597.27740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867597.29338: done with get_vars() 30575 1726867597.29360: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:26:37 -0400 (0:00:01.867) 0:00:32.672 ****** 30575 1726867597.29470: entering _queue_task() for managed_node3/package_facts 30575 1726867597.29782: worker is 1 (out of 1 available) 30575 1726867597.29796: exiting _queue_task() for managed_node3/package_facts 30575 1726867597.29809: done queuing things up, now waiting for results queue to drain 30575 1726867597.29811: waiting for pending results... 
30575 1726867597.30118: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867597.30385: in run() - task 0affcac9-a3a5-e081-a588-000000000b96 30575 1726867597.30389: variable 'ansible_search_path' from source: unknown 30575 1726867597.30393: variable 'ansible_search_path' from source: unknown 30575 1726867597.30416: calling self._execute() 30575 1726867597.30528: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867597.30532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867597.30555: variable 'omit' from source: magic vars 30575 1726867597.30913: variable 'ansible_distribution_major_version' from source: facts 30575 1726867597.30928: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867597.30931: variable 'omit' from source: magic vars 30575 1726867597.31005: variable 'omit' from source: magic vars 30575 1726867597.31029: variable 'omit' from source: magic vars 30575 1726867597.31067: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867597.31155: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867597.31161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867597.31170: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867597.31195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867597.31216: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867597.31233: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867597.31239: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867597.31355: Set connection var ansible_pipelining to False 30575 1726867597.31358: Set connection var ansible_shell_type to sh 30575 1726867597.31409: Set connection var ansible_shell_executable to /bin/sh 30575 1726867597.31416: Set connection var ansible_timeout to 10 30575 1726867597.31419: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867597.31421: Set connection var ansible_connection to ssh 30575 1726867597.31524: variable 'ansible_shell_executable' from source: unknown 30575 1726867597.31528: variable 'ansible_connection' from source: unknown 30575 1726867597.31531: variable 'ansible_module_compression' from source: unknown 30575 1726867597.31533: variable 'ansible_shell_type' from source: unknown 30575 1726867597.31535: variable 'ansible_shell_executable' from source: unknown 30575 1726867597.31537: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867597.31539: variable 'ansible_pipelining' from source: unknown 30575 1726867597.31541: variable 'ansible_timeout' from source: unknown 30575 1726867597.31543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867597.31700: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867597.31713: variable 'omit' from source: magic vars 30575 1726867597.31770: starting attempt loop 30575 1726867597.31773: running the handler 30575 1726867597.31776: _low_level_execute_command(): starting 30575 1726867597.31781: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867597.32786: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867597.32806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867597.32828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867597.32963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867597.32968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867597.34650: stdout chunk (state=3): >>>/root <<< 30575 1726867597.34883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867597.34887: stderr chunk (state=3): >>><<< 30575 1726867597.34889: stdout chunk (state=3): >>><<< 30575 1726867597.34899: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867597.34913: _low_level_execute_command(): starting 30575 1726867597.34921: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867597.3489907-32114-25926891626066 `" && echo ansible-tmp-1726867597.3489907-32114-25926891626066="` echo /root/.ansible/tmp/ansible-tmp-1726867597.3489907-32114-25926891626066 `" ) && sleep 0' 30575 1726867597.35513: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867597.35531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867597.35555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867597.35594: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867597.35689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867597.35807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867597.37701: stdout chunk (state=3): >>>ansible-tmp-1726867597.3489907-32114-25926891626066=/root/.ansible/tmp/ansible-tmp-1726867597.3489907-32114-25926891626066 <<< 30575 1726867597.37857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867597.37861: stdout chunk (state=3): >>><<< 30575 1726867597.37863: stderr chunk (state=3): >>><<< 30575 1726867597.37985: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867597.3489907-32114-25926891626066=/root/.ansible/tmp/ansible-tmp-1726867597.3489907-32114-25926891626066 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867597.37989: variable 'ansible_module_compression' from source: unknown 30575 1726867597.38010: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30575 1726867597.38084: variable 'ansible_facts' from source: unknown 30575 1726867597.38350: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867597.3489907-32114-25926891626066/AnsiballZ_package_facts.py 30575 1726867597.38599: Sending initial data 30575 1726867597.38619: Sent initial data (161 bytes) 30575 1726867597.39107: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867597.39114: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867597.39127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867597.39138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867597.39149: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867597.39155: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867597.39163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867597.39175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867597.39185: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867597.39191: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 30575 1726867597.39204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867597.39220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867597.39296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867597.39341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867597.39344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867597.39380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867597.39419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867597.41024: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867597.41131: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867597.41225: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp10v2hll3 /root/.ansible/tmp/ansible-tmp-1726867597.3489907-32114-25926891626066/AnsiballZ_package_facts.py <<< 30575 1726867597.41228: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867597.3489907-32114-25926891626066/AnsiballZ_package_facts.py" <<< 30575 1726867597.41327: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp10v2hll3" to remote "/root/.ansible/tmp/ansible-tmp-1726867597.3489907-32114-25926891626066/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867597.3489907-32114-25926891626066/AnsiballZ_package_facts.py" <<< 30575 1726867597.43316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867597.43399: stderr chunk (state=3): >>><<< 30575 1726867597.43483: stdout chunk (state=3): >>><<< 30575 1726867597.43486: done transferring module to remote 30575 1726867597.43488: _low_level_execute_command(): starting 30575 1726867597.43491: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867597.3489907-32114-25926891626066/ /root/.ansible/tmp/ansible-tmp-1726867597.3489907-32114-25926891626066/AnsiballZ_package_facts.py && sleep 0' 30575 1726867597.44253: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867597.44292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867597.44308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867597.44319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867597.44427: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867597.44481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867597.44499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867597.44521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867597.44603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867597.46412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867597.46438: stderr chunk (state=3): >>><<< 30575 1726867597.46442: stdout chunk (state=3): >>><<< 30575 1726867597.46453: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867597.46456: _low_level_execute_command(): starting 30575 1726867597.46461: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867597.3489907-32114-25926891626066/AnsiballZ_package_facts.py && sleep 0' 30575 1726867597.46960: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867597.46986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867597.47007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867597.47059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' 
<<< 30575 1726867597.47062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867597.47118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867597.91691: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": 
"nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", 
"version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", 
"version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": 
[{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": 
[{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": 
[{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": 
"grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": 
"1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": 
[{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": 
[{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": 
[{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": 
"perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": 
"xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": 
"keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", 
"source": "rpm"}], "cloud<<< 30575 1726867597.91714: stdout chunk (state=3): >>>-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30575 1726867597.93057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867597.93062: stdout chunk (state=3): >>><<< 30575 1726867597.93080: stderr chunk (state=3): >>><<< 30575 1726867597.93217: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": 
"20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", 
"release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": 
"2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": 
"4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": 
"3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": 
"4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", 
"version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": 
"1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": 
[{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": 
"1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": 
"perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": 
"python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867597.98203: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867597.3489907-32114-25926891626066/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867597.98229: _low_level_execute_command(): starting 30575 1726867597.98233: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867597.3489907-32114-25926891626066/ > /dev/null 2>&1 && sleep 0' 30575 1726867597.99368: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867597.99463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867597.99689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867598.01498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867598.01502: stdout chunk (state=3): >>><<< 30575 1726867598.01508: stderr chunk (state=3): >>><<< 30575 1726867598.01527: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867598.01530: handler run complete 30575 1726867598.03082: variable 'ansible_facts' from source: unknown 30575 1726867598.03754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867598.07243: variable 'ansible_facts' from source: unknown 30575 1726867598.07707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867598.08583: attempt loop complete, returning result 30575 1726867598.08586: _execute() done 30575 1726867598.08588: dumping result to json 30575 1726867598.09032: done dumping result, returning 30575 1726867598.09035: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-e081-a588-000000000b96] 30575 1726867598.09038: sending task result for task 0affcac9-a3a5-e081-a588-000000000b96 30575 1726867598.13922: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b96 30575 1726867598.13930: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867598.14281: no more pending results, returning what we have 30575 1726867598.14291: results queue empty 30575 1726867598.14292: checking for any_errors_fatal 30575 1726867598.14298: done checking for any_errors_fatal 30575 1726867598.14298: checking for max_fail_percentage 30575 1726867598.14300: done checking for max_fail_percentage 30575 1726867598.14301: checking to see if all hosts have failed and the running result is not ok 30575 1726867598.14302: 
done checking to see if all hosts have failed 30575 1726867598.14302: getting the remaining hosts for this loop 30575 1726867598.14304: done getting the remaining hosts for this loop 30575 1726867598.14307: getting the next task for host managed_node3 30575 1726867598.14315: done getting next task for host managed_node3 30575 1726867598.14318: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867598.14325: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867598.14338: getting variables 30575 1726867598.14339: in VariableManager get_vars() 30575 1726867598.14371: Calling all_inventory to load vars for managed_node3 30575 1726867598.14374: Calling groups_inventory to load vars for managed_node3 30575 1726867598.14376: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867598.14509: Calling all_plugins_play to load vars for managed_node3 30575 1726867598.14515: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867598.14519: Calling groups_plugins_play to load vars for managed_node3 30575 1726867598.16653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867598.18617: done with get_vars() 30575 1726867598.18646: done getting variables 30575 1726867598.18746: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:26:38 -0400 (0:00:00.893) 0:00:33.565 ****** 30575 1726867598.18803: entering _queue_task() for managed_node3/debug 30575 1726867598.19352: worker is 1 (out of 1 available) 30575 1726867598.19368: exiting _queue_task() for managed_node3/debug 30575 1726867598.19385: done queuing things up, now waiting for results queue to drain 30575 1726867598.19387: waiting for pending results... 
30575 1726867598.19775: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867598.19871: in run() - task 0affcac9-a3a5-e081-a588-000000000b34 30575 1726867598.19875: variable 'ansible_search_path' from source: unknown 30575 1726867598.19879: variable 'ansible_search_path' from source: unknown 30575 1726867598.19906: calling self._execute() 30575 1726867598.20030: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867598.20042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867598.20063: variable 'omit' from source: magic vars 30575 1726867598.20758: variable 'ansible_distribution_major_version' from source: facts 30575 1726867598.20800: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867598.20865: variable 'omit' from source: magic vars 30575 1726867598.20914: variable 'omit' from source: magic vars 30575 1726867598.21085: variable 'network_provider' from source: set_fact 30575 1726867598.21193: variable 'omit' from source: magic vars 30575 1726867598.21226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867598.21305: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867598.21350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867598.21424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867598.21469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867598.21532: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867598.21536: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867598.21584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867598.21675: Set connection var ansible_pipelining to False 30575 1726867598.21695: Set connection var ansible_shell_type to sh 30575 1726867598.21713: Set connection var ansible_shell_executable to /bin/sh 30575 1726867598.21734: Set connection var ansible_timeout to 10 30575 1726867598.21799: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867598.21803: Set connection var ansible_connection to ssh 30575 1726867598.21818: variable 'ansible_shell_executable' from source: unknown 30575 1726867598.21837: variable 'ansible_connection' from source: unknown 30575 1726867598.21852: variable 'ansible_module_compression' from source: unknown 30575 1726867598.21920: variable 'ansible_shell_type' from source: unknown 30575 1726867598.21925: variable 'ansible_shell_executable' from source: unknown 30575 1726867598.21928: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867598.21930: variable 'ansible_pipelining' from source: unknown 30575 1726867598.21932: variable 'ansible_timeout' from source: unknown 30575 1726867598.21934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867598.22190: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867598.22208: variable 'omit' from source: magic vars 30575 1726867598.22217: starting attempt loop 30575 1726867598.22223: running the handler 30575 1726867598.22297: handler run complete 30575 1726867598.22316: attempt loop complete, returning result 30575 1726867598.22328: _execute() done 30575 1726867598.22354: dumping result to json 30575 1726867598.22359: done dumping result, returning 
30575 1726867598.22394: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-e081-a588-000000000b34] 30575 1726867598.22398: sending task result for task 0affcac9-a3a5-e081-a588-000000000b34 30575 1726867598.22573: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b34 30575 1726867598.22694: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30575 1726867598.22780: no more pending results, returning what we have 30575 1726867598.22784: results queue empty 30575 1726867598.22785: checking for any_errors_fatal 30575 1726867598.22797: done checking for any_errors_fatal 30575 1726867598.22798: checking for max_fail_percentage 30575 1726867598.22801: done checking for max_fail_percentage 30575 1726867598.22803: checking to see if all hosts have failed and the running result is not ok 30575 1726867598.22804: done checking to see if all hosts have failed 30575 1726867598.22806: getting the remaining hosts for this loop 30575 1726867598.22808: done getting the remaining hosts for this loop 30575 1726867598.22812: getting the next task for host managed_node3 30575 1726867598.22822: done getting next task for host managed_node3 30575 1726867598.22827: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867598.22833: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867598.22846: getting variables 30575 1726867598.22848: in VariableManager get_vars() 30575 1726867598.23114: Calling all_inventory to load vars for managed_node3 30575 1726867598.23121: Calling groups_inventory to load vars for managed_node3 30575 1726867598.23132: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867598.23141: Calling all_plugins_play to load vars for managed_node3 30575 1726867598.23144: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867598.23147: Calling groups_plugins_play to load vars for managed_node3 30575 1726867598.24805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867598.26850: done with get_vars() 30575 1726867598.26879: done getting variables 30575 1726867598.26982: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:26:38 -0400 (0:00:00.082) 0:00:33.647 ****** 30575 1726867598.27031: entering _queue_task() for managed_node3/fail 30575 1726867598.27439: worker is 1 (out of 1 available) 30575 1726867598.27452: exiting _queue_task() for managed_node3/fail 30575 1726867598.27466: done queuing things up, now waiting for results queue to drain 30575 1726867598.27468: waiting for pending results... 30575 1726867598.27793: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867598.27939: in run() - task 0affcac9-a3a5-e081-a588-000000000b35 30575 1726867598.28082: variable 'ansible_search_path' from source: unknown 30575 1726867598.28090: variable 'ansible_search_path' from source: unknown 30575 1726867598.28094: calling self._execute() 30575 1726867598.28140: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867598.28153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867598.28175: variable 'omit' from source: magic vars 30575 1726867598.28599: variable 'ansible_distribution_major_version' from source: facts 30575 1726867598.28608: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867598.28720: variable 'network_state' from source: role '' defaults 30575 1726867598.28729: Evaluated conditional (network_state != {}): False 30575 1726867598.28736: when evaluation is False, skipping this task 30575 1726867598.28742: _execute() done 30575 1726867598.28746: dumping result to json 30575 1726867598.28749: done dumping result, returning 30575 1726867598.28752: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-e081-a588-000000000b35] 30575 1726867598.28755: sending task result for task 0affcac9-a3a5-e081-a588-000000000b35 30575 1726867598.28881: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b35 30575 1726867598.28884: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867598.28963: no more pending results, returning what we have 30575 1726867598.28966: results queue empty 30575 1726867598.28967: checking for any_errors_fatal 30575 1726867598.28976: done checking for any_errors_fatal 30575 1726867598.28978: checking for max_fail_percentage 30575 1726867598.28980: done checking for max_fail_percentage 30575 1726867598.28981: checking to see if all hosts have failed and the running result is not ok 30575 1726867598.28983: done checking to see if all hosts have failed 30575 1726867598.28984: getting the remaining hosts for this loop 30575 1726867598.28986: done getting the remaining hosts for this loop 30575 1726867598.28989: getting the next task for host managed_node3 30575 1726867598.28997: done getting next task for host managed_node3 30575 1726867598.29002: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867598.29007: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867598.29030: getting variables 30575 1726867598.29032: in VariableManager get_vars() 30575 1726867598.29061: Calling all_inventory to load vars for managed_node3 30575 1726867598.29065: Calling groups_inventory to load vars for managed_node3 30575 1726867598.29067: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867598.29076: Calling all_plugins_play to load vars for managed_node3 30575 1726867598.29140: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867598.29145: Calling groups_plugins_play to load vars for managed_node3 30575 1726867598.30284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867598.37688: done with get_vars() 30575 1726867598.37706: done getting variables 30575 1726867598.37770: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:26:38 -0400 (0:00:00.107) 0:00:33.755 ****** 30575 1726867598.37811: entering _queue_task() for managed_node3/fail 30575 1726867598.38238: worker is 1 (out of 1 available) 30575 1726867598.38252: exiting _queue_task() for managed_node3/fail 30575 1726867598.38267: done queuing things up, now waiting for results queue to drain 30575 1726867598.38271: waiting for pending results... 30575 1726867598.38619: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867598.38734: in run() - task 0affcac9-a3a5-e081-a588-000000000b36 30575 1726867598.38747: variable 'ansible_search_path' from source: unknown 30575 1726867598.38755: variable 'ansible_search_path' from source: unknown 30575 1726867598.38869: calling self._execute() 30575 1726867598.38906: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867598.38928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867598.38936: variable 'omit' from source: magic vars 30575 1726867598.39353: variable 'ansible_distribution_major_version' from source: facts 30575 1726867598.39361: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867598.39545: variable 'network_state' from source: role '' defaults 30575 1726867598.39551: Evaluated conditional (network_state != {}): False 30575 1726867598.39556: when evaluation is False, skipping this task 30575 1726867598.39560: _execute() done 30575 1726867598.39562: dumping result to json 30575 1726867598.39564: done dumping result, returning 30575 1726867598.39567: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-e081-a588-000000000b36] 30575 1726867598.39569: sending task result for task 0affcac9-a3a5-e081-a588-000000000b36 30575 1726867598.39811: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b36 30575 1726867598.39813: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867598.39927: no more pending results, returning what we have 30575 1726867598.39930: results queue empty 30575 1726867598.39931: checking for any_errors_fatal 30575 1726867598.39939: done checking for any_errors_fatal 30575 1726867598.39941: checking for max_fail_percentage 30575 1726867598.39947: done checking for max_fail_percentage 30575 1726867598.39950: checking to see if all hosts have failed and the running result is not ok 30575 1726867598.39951: done checking to see if all hosts have failed 30575 1726867598.39952: getting the remaining hosts for this loop 30575 1726867598.39953: done getting the remaining hosts for this loop 30575 1726867598.39957: getting the next task for host managed_node3 30575 1726867598.39972: done getting next task for host managed_node3 30575 1726867598.39978: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867598.39985: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
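Editorial note: the skip above (`"false_condition": "network_state != {}"`) comes from a guard task at `roles/network/tasks/main.yml:18`. A minimal sketch of what such a task looks like — the `when` clauses mirror the two "Evaluated conditional" lines in the log, while the exact `fail` message text is an assumption:

```yaml
# Sketch of the guard task evaluated above. Both conditions must be true
# for the task to run; the log shows the first evaluated True and the
# second evaluated False, so the task was skipped.
- name: Abort applying the network state configuration if the system version of the managed host is below 8
  fail:
    msg: Applying network_state is not supported on this system version  # assumed wording
  when:
    - ansible_distribution_major_version != '6'  # evaluated True in the log
    - network_state != {}                        # evaluated False -> skipped
```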
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867598.40020: getting variables 30575 1726867598.40022: in VariableManager get_vars() 30575 1726867598.40065: Calling all_inventory to load vars for managed_node3 30575 1726867598.40068: Calling groups_inventory to load vars for managed_node3 30575 1726867598.40071: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867598.40129: Calling all_plugins_play to load vars for managed_node3 30575 1726867598.40138: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867598.40143: Calling groups_plugins_play to load vars for managed_node3 30575 1726867598.41392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867598.42732: done with get_vars() 30575 1726867598.42761: done getting variables 30575 1726867598.42831: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:26:38 -0400 (0:00:00.050) 0:00:33.806 ****** 30575 1726867598.42873: entering _queue_task() for managed_node3/fail 30575 1726867598.43235: worker is 1 (out of 1 available) 30575 1726867598.43250: exiting _queue_task() for managed_node3/fail 30575 1726867598.43265: done queuing things up, now waiting for results queue to drain 30575 1726867598.43267: waiting for pending results... 30575 1726867598.43834: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867598.43966: in run() - task 0affcac9-a3a5-e081-a588-000000000b37 30575 1726867598.43971: variable 'ansible_search_path' from source: unknown 30575 1726867598.43975: variable 'ansible_search_path' from source: unknown 30575 1726867598.43980: calling self._execute() 30575 1726867598.44137: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867598.44144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867598.44148: variable 'omit' from source: magic vars 30575 1726867598.44751: variable 'ansible_distribution_major_version' from source: facts 30575 1726867598.44759: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867598.44975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867598.47604: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867598.47657: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867598.47688: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867598.47715: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867598.47739: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867598.47796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867598.47818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867598.47838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.47867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867598.47879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867598.47949: variable 'ansible_distribution_major_version' from source: facts 30575 1726867598.47965: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30575 1726867598.48041: variable 'ansible_distribution' from source: facts 30575 1726867598.48045: variable '__network_rh_distros' from source: role '' defaults 30575 1726867598.48053: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30575 1726867598.48226: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867598.48244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867598.48264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.48292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867598.48302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867598.48336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867598.48352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867598.48371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.48399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 
1726867598.48408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867598.48440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867598.48455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867598.48476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.48500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867598.48513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867598.48710: variable 'network_connections' from source: include params 30575 1726867598.48720: variable 'interface' from source: play vars 30575 1726867598.48767: variable 'interface' from source: play vars 30575 1726867598.48780: variable 'network_state' from source: role '' defaults 30575 1726867598.48824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867598.49200: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867598.49229: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867598.49262: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867598.49308: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867598.49327: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867598.49343: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867598.49365: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.49384: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867598.49424: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30575 1726867598.49427: when evaluation is False, skipping this task 30575 1726867598.49430: _execute() done 30575 1726867598.49433: dumping result to json 30575 1726867598.49435: done dumping result, returning 30575 1726867598.49448: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-e081-a588-000000000b37] 30575 1726867598.49451: sending task result for task 
0affcac9-a3a5-e081-a588-000000000b37 30575 1726867598.49545: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b37 30575 1726867598.49552: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30575 1726867598.49641: no more pending results, returning what we have 30575 1726867598.49644: results queue empty 30575 1726867598.49645: checking for any_errors_fatal 30575 1726867598.49650: done checking for any_errors_fatal 30575 1726867598.49651: checking for max_fail_percentage 30575 1726867598.49652: done checking for max_fail_percentage 30575 1726867598.49653: checking to see if all hosts have failed and the running result is not ok 30575 1726867598.49654: done checking to see if all hosts have failed 30575 1726867598.49655: getting the remaining hosts for this loop 30575 1726867598.49656: done getting the remaining hosts for this loop 30575 1726867598.49659: getting the next task for host managed_node3 30575 1726867598.49666: done getting next task for host managed_node3 30575 1726867598.49671: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867598.49675: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
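Editorial note: the `false_condition` above is a Jinja2 pipeline that scans `network_connections` and `network_state["interfaces"]` for any entry whose `type` is `team`. A hedged sketch of the guard task at `roles/network/tasks/main.yml:25`, with the conditionals taken from the log's "Evaluated conditional" lines (the message text is an assumption):

```yaml
# Sketch of the EL10 teaming guard. The first two conditions evaluated True;
# the selectattr pipeline found no team-type entries, so the task was skipped.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    msg: Team interfaces are not supported on EL10 or later  # assumed wording
  when:
    - ansible_distribution_major_version | int > 9        # True in the log
    - ansible_distribution in __network_rh_distros        # True in the log
    - >-
      network_connections | selectattr("type", "defined") |
      selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined") |
      selectattr("type", "match", "^team$") | list | length > 0
```

`selectattr("type", "defined")` first drops entries with no `type` key so the subsequent `match` test cannot fail on undefined attributes; `list | length > 0` then turns the filtered generator into a countable result.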
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867598.49698: getting variables 30575 1726867598.49699: in VariableManager get_vars() 30575 1726867598.49731: Calling all_inventory to load vars for managed_node3 30575 1726867598.49734: Calling groups_inventory to load vars for managed_node3 30575 1726867598.49736: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867598.49744: Calling all_plugins_play to load vars for managed_node3 30575 1726867598.49746: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867598.49749: Calling groups_plugins_play to load vars for managed_node3 30575 1726867598.50786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867598.51853: done with get_vars() 30575 1726867598.51867: done getting variables 30575 1726867598.51910: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:26:38 -0400 (0:00:00.090) 0:00:33.896 ****** 30575 1726867598.51934: entering _queue_task() for managed_node3/dnf 30575 1726867598.52160: worker is 1 (out of 1 available) 30575 1726867598.52173: exiting _queue_task() for managed_node3/dnf 30575 1726867598.52187: done queuing things up, now waiting for results queue to drain 30575 1726867598.52189: waiting for pending results... 30575 1726867598.52383: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867598.52476: in run() - task 0affcac9-a3a5-e081-a588-000000000b38 30575 1726867598.52489: variable 'ansible_search_path' from source: unknown 30575 1726867598.52492: variable 'ansible_search_path' from source: unknown 30575 1726867598.52527: calling self._execute() 30575 1726867598.52594: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867598.52598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867598.52607: variable 'omit' from source: magic vars 30575 1726867598.52933: variable 'ansible_distribution_major_version' from source: facts 30575 1726867598.52943: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867598.53154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867598.54995: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867598.55202: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867598.55206: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867598.55208: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867598.55242: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867598.55384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867598.55435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867598.55492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.55593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867598.55596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867598.55661: variable 'ansible_distribution' from source: facts 30575 1726867598.55665: variable 'ansible_distribution_major_version' from source: facts 30575 1726867598.55701: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30575 1726867598.55779: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867598.55881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867598.55898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867598.55915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.55941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867598.55951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867598.55983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867598.56000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867598.56016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.56041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867598.56051: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867598.56081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867598.56099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867598.56115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.56140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867598.56151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867598.56255: variable 'network_connections' from source: include params 30575 1726867598.56263: variable 'interface' from source: play vars 30575 1726867598.56327: variable 'interface' from source: play vars 30575 1726867598.56375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867598.56496: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867598.56523: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867598.56550: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867598.56572: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867598.56603: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867598.56618: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867598.56645: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.56663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867598.56706: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867598.56863: variable 'network_connections' from source: include params 30575 1726867598.56867: variable 'interface' from source: play vars 30575 1726867598.56907: variable 'interface' from source: play vars 30575 1726867598.56931: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867598.56934: when evaluation is False, skipping this task 30575 1726867598.56937: _execute() done 30575 1726867598.56939: dumping result to json 30575 1726867598.56941: done dumping result, returning 30575 1726867598.56949: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000000b38] 30575 
1726867598.56954: sending task result for task 0affcac9-a3a5-e081-a588-000000000b38 30575 1726867598.57041: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b38 30575 1726867598.57045: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867598.57107: no more pending results, returning what we have 30575 1726867598.57110: results queue empty 30575 1726867598.57111: checking for any_errors_fatal 30575 1726867598.57117: done checking for any_errors_fatal 30575 1726867598.57117: checking for max_fail_percentage 30575 1726867598.57119: done checking for max_fail_percentage 30575 1726867598.57120: checking to see if all hosts have failed and the running result is not ok 30575 1726867598.57121: done checking to see if all hosts have failed 30575 1726867598.57121: getting the remaining hosts for this loop 30575 1726867598.57123: done getting the remaining hosts for this loop 30575 1726867598.57127: getting the next task for host managed_node3 30575 1726867598.57135: done getting next task for host managed_node3 30575 1726867598.57138: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867598.57143: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
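Editorial note: the skipped task above (`roles/network/tasks/main.yml:36`) is a check-mode `dnf` call gated on the role-default helper variables `__network_wireless_connections_defined` and `__network_team_connections_defined`, which the log shows being resolved from `role '' defaults`. A rough sketch under those assumptions — the package variable name is hypothetical:

```yaml
# Sketch of the DNF update-availability check. Neither helper variable was
# true for this run (no wireless or team connections), so the task was skipped.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  dnf:
    name: "{{ network_packages }}"  # hypothetical variable name
    state: latest
  check_mode: true
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined
```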
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867598.57162: getting variables 30575 1726867598.57163: in VariableManager get_vars() 30575 1726867598.57213: Calling all_inventory to load vars for managed_node3 30575 1726867598.57216: Calling groups_inventory to load vars for managed_node3 30575 1726867598.57219: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867598.57228: Calling all_plugins_play to load vars for managed_node3 30575 1726867598.57231: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867598.57234: Calling groups_plugins_play to load vars for managed_node3 30575 1726867598.58198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867598.59648: done with get_vars() 30575 1726867598.59678: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867598.59748: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:26:38 -0400 (0:00:00.078) 0:00:33.975 ****** 30575 1726867598.59772: entering _queue_task() for managed_node3/yum 30575 1726867598.60122: worker is 1 (out of 1 available) 30575 1726867598.60141: exiting _queue_task() for managed_node3/yum 30575 1726867598.60161: done queuing things up, now waiting for results queue to drain 30575 1726867598.60163: waiting for pending results... 30575 1726867598.60439: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867598.60704: in run() - task 0affcac9-a3a5-e081-a588-000000000b39 30575 1726867598.60708: variable 'ansible_search_path' from source: unknown 30575 1726867598.60710: variable 'ansible_search_path' from source: unknown 30575 1726867598.60713: calling self._execute() 30575 1726867598.60761: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867598.60765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867598.60780: variable 'omit' from source: magic vars 30575 1726867598.61148: variable 'ansible_distribution_major_version' from source: facts 30575 1726867598.61157: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867598.61278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867598.64408: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867598.64462: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867598.64508: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867598.64684: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867598.64690: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867598.64695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867598.64744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867598.64785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.64830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867598.64849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867598.64961: variable 'ansible_distribution_major_version' from source: facts 30575 1726867598.64974: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30575 1726867598.64979: when evaluation is False, skipping this task 30575 1726867598.64982: _execute() done 30575 1726867598.64984: dumping result to json 30575 1726867598.64986: done dumping result, returning 30575 1726867598.64995: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000000b39] 30575 1726867598.65000: sending task result for task 0affcac9-a3a5-e081-a588-000000000b39 30575 1726867598.65120: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b39 30575 1726867598.65123: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30575 1726867598.65186: no more pending results, returning what we have 30575 1726867598.65190: results queue empty 30575 1726867598.65190: checking for any_errors_fatal 30575 1726867598.65195: done checking for any_errors_fatal 30575 1726867598.65195: checking for max_fail_percentage 30575 1726867598.65197: done checking for max_fail_percentage 30575 1726867598.65198: checking to see if all hosts have failed and the running result is not ok 30575 1726867598.65199: done checking to see if all hosts have failed 30575 1726867598.65199: getting the remaining hosts for this loop 30575 1726867598.65201: done getting the remaining hosts for this loop 30575 1726867598.65204: getting the next task for host managed_node3 30575 1726867598.65211: done getting next task for host managed_node3 30575 1726867598.65214: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867598.65219: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867598.65238: getting variables 30575 1726867598.65239: in VariableManager get_vars() 30575 1726867598.65271: Calling all_inventory to load vars for managed_node3 30575 1726867598.65273: Calling groups_inventory to load vars for managed_node3 30575 1726867598.65275: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867598.65287: Calling all_plugins_play to load vars for managed_node3 30575 1726867598.65292: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867598.65296: Calling groups_plugins_play to load vars for managed_node3 30575 1726867598.66476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867598.69503: done with get_vars() 30575 1726867598.69532: done getting variables 30575 1726867598.69594: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:26:38 -0400 (0:00:00.098) 0:00:34.073 ****** 30575 1726867598.69633: entering _queue_task() for managed_node3/fail 30575 1726867598.69958: worker is 1 (out of 1 available) 30575 1726867598.69971: exiting _queue_task() for managed_node3/fail 30575 1726867598.69984: done queuing things up, now waiting for results queue to drain 30575 1726867598.69986: waiting for pending results... 30575 1726867598.70429: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867598.70434: in run() - task 0affcac9-a3a5-e081-a588-000000000b3a 30575 1726867598.70436: variable 'ansible_search_path' from source: unknown 30575 1726867598.70439: variable 'ansible_search_path' from source: unknown 30575 1726867598.70484: calling self._execute() 30575 1726867598.70543: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867598.70547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867598.70557: variable 'omit' from source: magic vars 30575 1726867598.70973: variable 'ansible_distribution_major_version' from source: facts 30575 1726867598.71072: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867598.71137: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867598.71373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867598.74083: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867598.74171: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867598.74236: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867598.74289: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867598.74330: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867598.74413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867598.74484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867598.74492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.74539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867598.74568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867598.74628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867598.74682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867598.74701: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.74768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867598.74785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867598.74913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867598.74985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867598.74989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.75035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867598.75056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867598.75333: variable 'network_connections' from source: include params 30575 1726867598.75355: variable 'interface' from source: play vars 30575 1726867598.75467: variable 'interface' from source: play vars 30575 1726867598.75536: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867598.75719: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867598.75781: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867598.75858: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867598.75861: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867598.75912: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867598.75940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867598.75982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.76018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867598.76118: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867598.76341: variable 'network_connections' from source: include params 30575 1726867598.76344: variable 'interface' from source: play vars 30575 1726867598.76389: variable 'interface' from source: play vars 30575 1726867598.76414: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867598.76418: when evaluation is False, skipping this task 30575 
1726867598.76421: _execute() done 30575 1726867598.76426: dumping result to json 30575 1726867598.76428: done dumping result, returning 30575 1726867598.76434: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000000b3a] 30575 1726867598.76439: sending task result for task 0affcac9-a3a5-e081-a588-000000000b3a skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867598.76584: no more pending results, returning what we have 30575 1726867598.76587: results queue empty 30575 1726867598.76587: checking for any_errors_fatal 30575 1726867598.76594: done checking for any_errors_fatal 30575 1726867598.76594: checking for max_fail_percentage 30575 1726867598.76596: done checking for max_fail_percentage 30575 1726867598.76597: checking to see if all hosts have failed and the running result is not ok 30575 1726867598.76598: done checking to see if all hosts have failed 30575 1726867598.76598: getting the remaining hosts for this loop 30575 1726867598.76600: done getting the remaining hosts for this loop 30575 1726867598.76603: getting the next task for host managed_node3 30575 1726867598.76614: done getting next task for host managed_node3 30575 1726867598.76617: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30575 1726867598.76622: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867598.76644: getting variables 30575 1726867598.76645: in VariableManager get_vars() 30575 1726867598.76682: Calling all_inventory to load vars for managed_node3 30575 1726867598.76685: Calling groups_inventory to load vars for managed_node3 30575 1726867598.76687: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867598.76696: Calling all_plugins_play to load vars for managed_node3 30575 1726867598.76698: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867598.76700: Calling groups_plugins_play to load vars for managed_node3 30575 1726867598.77290: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b3a 30575 1726867598.77294: WORKER PROCESS EXITING 30575 1726867598.77950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867598.79071: done with get_vars() 30575 1726867598.79094: done getting variables 30575 1726867598.79149: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:26:38 -0400 (0:00:00.095) 0:00:34.169 ****** 30575 1726867598.79184: entering _queue_task() for managed_node3/package 30575 1726867598.79461: worker is 1 (out of 1 available) 30575 1726867598.79474: exiting _queue_task() for managed_node3/package 30575 1726867598.79489: done queuing things up, now waiting for results queue to drain 30575 1726867598.79491: waiting for pending results... 30575 1726867598.79775: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30575 1726867598.79866: in run() - task 0affcac9-a3a5-e081-a588-000000000b3b 30575 1726867598.79879: variable 'ansible_search_path' from source: unknown 30575 1726867598.79884: variable 'ansible_search_path' from source: unknown 30575 1726867598.79912: calling self._execute() 30575 1726867598.79982: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867598.79986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867598.80005: variable 'omit' from source: magic vars 30575 1726867598.80484: variable 'ansible_distribution_major_version' from source: facts 30575 1726867598.80487: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867598.80654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867598.80963: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867598.81009: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867598.81055: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867598.81431: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867598.81519: variable 'network_packages' from source: role '' defaults 30575 1726867598.81589: variable '__network_provider_setup' from source: role '' defaults 30575 1726867598.81602: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867598.81648: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867598.81655: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867598.81700: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867598.81814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867598.84009: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867598.84012: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867598.84228: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867598.84251: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867598.84278: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867598.84351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867598.84430: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867598.84456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.84552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867598.84562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867598.84565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867598.84581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867598.84632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.84671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867598.84686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 
1726867598.85084: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867598.85095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867598.85126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867598.85156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.85202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867598.85219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867598.85315: variable 'ansible_python' from source: facts 30575 1726867598.85338: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867598.85438: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867598.85483: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867598.85704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867598.85708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867598.85711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.85713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867598.85729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867598.85772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867598.85797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867598.85839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.85878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867598.85892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867598.86037: variable 'network_connections' from source: include params 
30575 1726867598.86041: variable 'interface' from source: play vars 30575 1726867598.86186: variable 'interface' from source: play vars 30575 1726867598.86213: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867598.86237: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867598.86282: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867598.86314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867598.86486: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867598.86742: variable 'network_connections' from source: include params 30575 1726867598.86748: variable 'interface' from source: play vars 30575 1726867598.86967: variable 'interface' from source: play vars 30575 1726867598.87081: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867598.87266: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867598.87473: variable 'network_connections' from source: include params 30575 1726867598.87479: variable 'interface' from source: play vars 30575 1726867598.87562: variable 'interface' from source: play vars 30575 1726867598.87565: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867598.87636: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867598.88201: variable 'network_connections' 
from source: include params 30575 1726867598.88204: variable 'interface' from source: play vars 30575 1726867598.88326: variable 'interface' from source: play vars 30575 1726867598.88375: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867598.88632: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867598.88639: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867598.88644: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867598.89060: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867598.90188: variable 'network_connections' from source: include params 30575 1726867598.90192: variable 'interface' from source: play vars 30575 1726867598.90194: variable 'interface' from source: play vars 30575 1726867598.90211: variable 'ansible_distribution' from source: facts 30575 1726867598.90214: variable '__network_rh_distros' from source: role '' defaults 30575 1726867598.90217: variable 'ansible_distribution_major_version' from source: facts 30575 1726867598.90239: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867598.90771: variable 'ansible_distribution' from source: facts 30575 1726867598.90775: variable '__network_rh_distros' from source: role '' defaults 30575 1726867598.90780: variable 'ansible_distribution_major_version' from source: facts 30575 1726867598.90793: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867598.91190: variable 'ansible_distribution' from source: facts 30575 1726867598.91194: variable '__network_rh_distros' from source: role '' defaults 30575 1726867598.91196: variable 'ansible_distribution_major_version' from source: facts 30575 1726867598.91300: variable 'network_provider' from source: set_fact 30575 
1726867598.91303: variable 'ansible_facts' from source: unknown 30575 1726867598.91904: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30575 1726867598.91912: when evaluation is False, skipping this task 30575 1726867598.91920: _execute() done 30575 1726867598.91930: dumping result to json 30575 1726867598.91937: done dumping result, returning 30575 1726867598.91950: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-e081-a588-000000000b3b] 30575 1726867598.91962: sending task result for task 0affcac9-a3a5-e081-a588-000000000b3b skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30575 1726867598.92144: no more pending results, returning what we have 30575 1726867598.92148: results queue empty 30575 1726867598.92149: checking for any_errors_fatal 30575 1726867598.92153: done checking for any_errors_fatal 30575 1726867598.92156: checking for max_fail_percentage 30575 1726867598.92158: done checking for max_fail_percentage 30575 1726867598.92159: checking to see if all hosts have failed and the running result is not ok 30575 1726867598.92159: done checking to see if all hosts have failed 30575 1726867598.92160: getting the remaining hosts for this loop 30575 1726867598.92161: done getting the remaining hosts for this loop 30575 1726867598.92212: getting the next task for host managed_node3 30575 1726867598.92220: done getting next task for host managed_node3 30575 1726867598.92224: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867598.92229: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867598.92253: getting variables 30575 1726867598.92254: in VariableManager get_vars() 30575 1726867598.92405: Calling all_inventory to load vars for managed_node3 30575 1726867598.92408: Calling groups_inventory to load vars for managed_node3 30575 1726867598.92416: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867598.92501: Calling all_plugins_play to load vars for managed_node3 30575 1726867598.92505: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867598.92510: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b3b 30575 1726867598.92513: WORKER PROCESS EXITING 30575 1726867598.92517: Calling groups_plugins_play to load vars for managed_node3 30575 1726867598.95859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867598.97963: done with get_vars() 30575 1726867598.97995: done getting variables 30575 1726867598.98055: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:26:38 -0400 (0:00:00.189) 0:00:34.358 ****** 30575 1726867598.98094: entering _queue_task() for managed_node3/package 30575 1726867598.98697: worker is 1 (out of 1 available) 30575 1726867598.98705: exiting _queue_task() for managed_node3/package 30575 1726867598.98715: done queuing things up, now waiting for results queue to drain 30575 1726867598.98716: waiting for pending results... 30575 1726867598.98775: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867598.98935: in run() - task 0affcac9-a3a5-e081-a588-000000000b3c 30575 1726867598.98962: variable 'ansible_search_path' from source: unknown 30575 1726867598.98966: variable 'ansible_search_path' from source: unknown 30575 1726867598.99003: calling self._execute() 30575 1726867598.99107: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867598.99113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867598.99122: variable 'omit' from source: magic vars 30575 1726867598.99539: variable 'ansible_distribution_major_version' from source: facts 30575 1726867598.99783: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867598.99786: variable 'network_state' from source: role '' defaults 30575 1726867598.99788: Evaluated conditional (network_state != {}): False 30575 1726867598.99790: when evaluation 
is False, skipping this task 30575 1726867598.99792: _execute() done 30575 1726867598.99794: dumping result to json 30575 1726867598.99796: done dumping result, returning 30575 1726867598.99798: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000000b3c] 30575 1726867598.99800: sending task result for task 0affcac9-a3a5-e081-a588-000000000b3c 30575 1726867598.99862: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b3c 30575 1726867598.99866: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867598.99912: no more pending results, returning what we have 30575 1726867598.99923: results queue empty 30575 1726867598.99924: checking for any_errors_fatal 30575 1726867598.99933: done checking for any_errors_fatal 30575 1726867598.99934: checking for max_fail_percentage 30575 1726867598.99935: done checking for max_fail_percentage 30575 1726867598.99936: checking to see if all hosts have failed and the running result is not ok 30575 1726867598.99937: done checking to see if all hosts have failed 30575 1726867598.99938: getting the remaining hosts for this loop 30575 1726867598.99939: done getting the remaining hosts for this loop 30575 1726867598.99943: getting the next task for host managed_node3 30575 1726867598.99953: done getting next task for host managed_node3 30575 1726867598.99957: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867598.99962: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867598.99988: getting variables 30575 1726867598.99990: in VariableManager get_vars() 30575 1726867599.00142: Calling all_inventory to load vars for managed_node3 30575 1726867599.00145: Calling groups_inventory to load vars for managed_node3 30575 1726867599.00148: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867599.00158: Calling all_plugins_play to load vars for managed_node3 30575 1726867599.00160: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867599.00163: Calling groups_plugins_play to load vars for managed_node3 30575 1726867599.02130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867599.04284: done with get_vars() 30575 1726867599.04304: done getting variables 30575 1726867599.04362: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:26:39 -0400 (0:00:00.063) 0:00:34.421 ****** 30575 1726867599.04405: entering _queue_task() for managed_node3/package 30575 1726867599.04876: worker is 1 (out of 1 available) 30575 1726867599.04950: exiting _queue_task() for managed_node3/package 30575 1726867599.04961: done queuing things up, now waiting for results queue to drain 30575 1726867599.04962: waiting for pending results... 30575 1726867599.05279: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867599.05438: in run() - task 0affcac9-a3a5-e081-a588-000000000b3d 30575 1726867599.05450: variable 'ansible_search_path' from source: unknown 30575 1726867599.05453: variable 'ansible_search_path' from source: unknown 30575 1726867599.05498: calling self._execute() 30575 1726867599.05608: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867599.05618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867599.05630: variable 'omit' from source: magic vars 30575 1726867599.06050: variable 'ansible_distribution_major_version' from source: facts 30575 1726867599.06061: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867599.06164: variable 'network_state' from source: role '' defaults 30575 1726867599.06176: Evaluated conditional (network_state != {}): False 30575 1726867599.06198: when evaluation is False, skipping this task 30575 1726867599.06202: _execute() done 30575 1726867599.06204: dumping 
result to json 30575 1726867599.06207: done dumping result, returning 30575 1726867599.06209: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000000b3d] 30575 1726867599.06212: sending task result for task 0affcac9-a3a5-e081-a588-000000000b3d 30575 1726867599.06329: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b3d 30575 1726867599.06332: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867599.06406: no more pending results, returning what we have 30575 1726867599.06413: results queue empty 30575 1726867599.06414: checking for any_errors_fatal 30575 1726867599.06418: done checking for any_errors_fatal 30575 1726867599.06421: checking for max_fail_percentage 30575 1726867599.06423: done checking for max_fail_percentage 30575 1726867599.06425: checking to see if all hosts have failed and the running result is not ok 30575 1726867599.06426: done checking to see if all hosts have failed 30575 1726867599.06427: getting the remaining hosts for this loop 30575 1726867599.06428: done getting the remaining hosts for this loop 30575 1726867599.06432: getting the next task for host managed_node3 30575 1726867599.06440: done getting next task for host managed_node3 30575 1726867599.06444: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867599.06449: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867599.06467: getting variables 30575 1726867599.06472: in VariableManager get_vars() 30575 1726867599.06511: Calling all_inventory to load vars for managed_node3 30575 1726867599.06518: Calling groups_inventory to load vars for managed_node3 30575 1726867599.06520: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867599.06535: Calling all_plugins_play to load vars for managed_node3 30575 1726867599.06538: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867599.06542: Calling groups_plugins_play to load vars for managed_node3 30575 1726867599.07474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867599.08857: done with get_vars() 30575 1726867599.08879: done getting variables 30575 1726867599.08939: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or 
team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:26:39 -0400 (0:00:00.045) 0:00:34.467 ****** 30575 1726867599.08974: entering _queue_task() for managed_node3/service 30575 1726867599.09245: worker is 1 (out of 1 available) 30575 1726867599.09259: exiting _queue_task() for managed_node3/service 30575 1726867599.09272: done queuing things up, now waiting for results queue to drain 30575 1726867599.09274: waiting for pending results... 30575 1726867599.09576: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867599.09662: in run() - task 0affcac9-a3a5-e081-a588-000000000b3e 30575 1726867599.09682: variable 'ansible_search_path' from source: unknown 30575 1726867599.09690: variable 'ansible_search_path' from source: unknown 30575 1726867599.09693: calling self._execute() 30575 1726867599.09868: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867599.09874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867599.09878: variable 'omit' from source: magic vars 30575 1726867599.10176: variable 'ansible_distribution_major_version' from source: facts 30575 1726867599.10194: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867599.10311: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867599.10521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867599.12426: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867599.12869: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867599.12907: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867599.12954: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867599.12980: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867599.13080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867599.13086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867599.13191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867599.13195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867599.13198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867599.13332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867599.13336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867599.13338: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867599.13340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867599.13342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867599.13364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867599.13407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867599.13430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867599.13458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867599.13503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867599.13730: variable 'network_connections' from source: include params 30575 1726867599.13733: variable 'interface' from source: play vars 30575 1726867599.13767: variable 'interface' from source: play vars 30575 1726867599.13847: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867599.13997: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867599.14038: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867599.14065: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867599.14109: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867599.14165: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867599.14200: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867599.14259: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867599.14266: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867599.14299: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867599.14520: variable 'network_connections' from source: include params 30575 1726867599.14526: variable 'interface' from source: play vars 30575 1726867599.14608: variable 'interface' from source: play vars 30575 1726867599.14633: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867599.14637: when evaluation is False, skipping this task 30575 
1726867599.14639: _execute() done 30575 1726867599.14641: dumping result to json 30575 1726867599.14644: done dumping result, returning 30575 1726867599.14646: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000000b3e] 30575 1726867599.14648: sending task result for task 0affcac9-a3a5-e081-a588-000000000b3e 30575 1726867599.14963: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b3e 30575 1726867599.14970: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867599.15034: no more pending results, returning what we have 30575 1726867599.15038: results queue empty 30575 1726867599.15039: checking for any_errors_fatal 30575 1726867599.15045: done checking for any_errors_fatal 30575 1726867599.15045: checking for max_fail_percentage 30575 1726867599.15050: done checking for max_fail_percentage 30575 1726867599.15051: checking to see if all hosts have failed and the running result is not ok 30575 1726867599.15052: done checking to see if all hosts have failed 30575 1726867599.15053: getting the remaining hosts for this loop 30575 1726867599.15054: done getting the remaining hosts for this loop 30575 1726867599.15061: getting the next task for host managed_node3 30575 1726867599.15066: done getting next task for host managed_node3 30575 1726867599.15072: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867599.15080: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867599.15099: getting variables 30575 1726867599.15100: in VariableManager get_vars() 30575 1726867599.15133: Calling all_inventory to load vars for managed_node3 30575 1726867599.15135: Calling groups_inventory to load vars for managed_node3 30575 1726867599.15137: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867599.15143: Calling all_plugins_play to load vars for managed_node3 30575 1726867599.15145: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867599.15146: Calling groups_plugins_play to load vars for managed_node3 30575 1726867599.16260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867599.17184: done with get_vars() 30575 1726867599.17198: done getting variables 30575 1726867599.17238: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:26:39 -0400 (0:00:00.082) 0:00:34.550 ****** 30575 1726867599.17263: entering _queue_task() for managed_node3/service 30575 1726867599.17469: worker is 1 (out of 1 available) 30575 1726867599.17483: exiting _queue_task() for managed_node3/service 30575 1726867599.17495: done queuing things up, now waiting for results queue to drain 30575 1726867599.17497: waiting for pending results... 30575 1726867599.17724: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867599.17822: in run() - task 0affcac9-a3a5-e081-a588-000000000b3f 30575 1726867599.17834: variable 'ansible_search_path' from source: unknown 30575 1726867599.17838: variable 'ansible_search_path' from source: unknown 30575 1726867599.17869: calling self._execute() 30575 1726867599.17976: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867599.17992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867599.17995: variable 'omit' from source: magic vars 30575 1726867599.18435: variable 'ansible_distribution_major_version' from source: facts 30575 1726867599.18439: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867599.18683: variable 'network_provider' from source: set_fact 30575 1726867599.18704: variable 'network_state' from source: role '' defaults 30575 1726867599.18726: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30575 1726867599.18743: variable 'omit' from source: magic vars 30575 1726867599.18842: variable 
'omit' from source: magic vars 30575 1726867599.18886: variable 'network_service_name' from source: role '' defaults 30575 1726867599.18951: variable 'network_service_name' from source: role '' defaults 30575 1726867599.19038: variable '__network_provider_setup' from source: role '' defaults 30575 1726867599.19041: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867599.19119: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867599.19135: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867599.19232: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867599.19479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867599.21793: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867599.21859: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867599.21933: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867599.21981: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867599.22018: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867599.22129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867599.22174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867599.22197: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867599.22250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867599.22254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867599.22301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867599.22321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867599.22360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867599.22381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867599.22393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867599.22575: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867599.22668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867599.22685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867599.22708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867599.22754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867599.22789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867599.22904: variable 'ansible_python' from source: facts 30575 1726867599.22906: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867599.23034: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867599.23109: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867599.23241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867599.23289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867599.23293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867599.23355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867599.23383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867599.23446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867599.23455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867599.23563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867599.23567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867599.23569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867599.23760: variable 'network_connections' from source: include params 30575 1726867599.23764: variable 'interface' from source: play vars 30575 1726867599.23885: variable 'interface' from source: play vars 30575 1726867599.23988: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867599.24221: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867599.24282: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867599.24337: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867599.24399: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867599.24481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867599.24518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867599.24550: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867599.24595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867599.24670: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867599.25019: variable 'network_connections' from source: include params 30575 1726867599.25022: variable 'interface' from source: play vars 30575 1726867599.25131: variable 'interface' from source: play vars 30575 1726867599.25147: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867599.25243: variable '__network_wireless_connections_defined' from source: role '' defaults 
30575 1726867599.25641: variable 'network_connections' from source: include params 30575 1726867599.25645: variable 'interface' from source: play vars 30575 1726867599.25716: variable 'interface' from source: play vars 30575 1726867599.25737: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867599.25845: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867599.26194: variable 'network_connections' from source: include params 30575 1726867599.26197: variable 'interface' from source: play vars 30575 1726867599.26255: variable 'interface' from source: play vars 30575 1726867599.26298: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867599.26366: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867599.26372: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867599.26440: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867599.26671: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867599.27240: variable 'network_connections' from source: include params 30575 1726867599.27243: variable 'interface' from source: play vars 30575 1726867599.27325: variable 'interface' from source: play vars 30575 1726867599.27328: variable 'ansible_distribution' from source: facts 30575 1726867599.27333: variable '__network_rh_distros' from source: role '' defaults 30575 1726867599.27335: variable 'ansible_distribution_major_version' from source: facts 30575 1726867599.27379: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867599.27555: variable 'ansible_distribution' from source: facts 30575 1726867599.27561: variable '__network_rh_distros' from source: role '' defaults 30575 1726867599.27564: variable 'ansible_distribution_major_version' from 
source: facts 30575 1726867599.27566: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867599.27692: variable 'ansible_distribution' from source: facts 30575 1726867599.27695: variable '__network_rh_distros' from source: role '' defaults 30575 1726867599.27699: variable 'ansible_distribution_major_version' from source: facts 30575 1726867599.27736: variable 'network_provider' from source: set_fact 30575 1726867599.27752: variable 'omit' from source: magic vars 30575 1726867599.27772: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867599.27793: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867599.27808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867599.27821: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867599.27830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867599.27854: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867599.27857: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867599.27859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867599.27928: Set connection var ansible_pipelining to False 30575 1726867599.27931: Set connection var ansible_shell_type to sh 30575 1726867599.27933: Set connection var ansible_shell_executable to /bin/sh 30575 1726867599.27938: Set connection var ansible_timeout to 10 30575 1726867599.27943: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867599.27949: Set connection var ansible_connection to ssh 30575 1726867599.27971: variable 'ansible_shell_executable' from 
source: unknown 30575 1726867599.27974: variable 'ansible_connection' from source: unknown 30575 1726867599.27978: variable 'ansible_module_compression' from source: unknown 30575 1726867599.27980: variable 'ansible_shell_type' from source: unknown 30575 1726867599.27983: variable 'ansible_shell_executable' from source: unknown 30575 1726867599.27985: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867599.27987: variable 'ansible_pipelining' from source: unknown 30575 1726867599.27989: variable 'ansible_timeout' from source: unknown 30575 1726867599.27991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867599.28061: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867599.28070: variable 'omit' from source: magic vars 30575 1726867599.28076: starting attempt loop 30575 1726867599.28081: running the handler 30575 1726867599.28136: variable 'ansible_facts' from source: unknown 30575 1726867599.28594: _low_level_execute_command(): starting 30575 1726867599.28600: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867599.29111: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867599.29115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867599.29117: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867599.29119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867599.29174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867599.29188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867599.29190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867599.29230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867599.30906: stdout chunk (state=3): >>>/root <<< 30575 1726867599.31028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867599.31040: stderr chunk (state=3): >>><<< 30575 1726867599.31043: stdout chunk (state=3): >>><<< 30575 1726867599.31063: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867599.31080: _low_level_execute_command(): starting 30575 1726867599.31084: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867599.3106065-32209-205371929827589 `" && echo ansible-tmp-1726867599.3106065-32209-205371929827589="` echo /root/.ansible/tmp/ansible-tmp-1726867599.3106065-32209-205371929827589 `" ) && sleep 0' 30575 1726867599.31582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867599.31612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867599.31620: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867599.31627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867599.31630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867599.31632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867599.31633: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867599.31675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867599.31683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867599.31731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867599.33609: stdout chunk (state=3): >>>ansible-tmp-1726867599.3106065-32209-205371929827589=/root/.ansible/tmp/ansible-tmp-1726867599.3106065-32209-205371929827589 <<< 30575 1726867599.33738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867599.33747: stderr chunk (state=3): >>><<< 30575 1726867599.33749: stdout chunk (state=3): >>><<< 30575 1726867599.33763: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867599.3106065-32209-205371929827589=/root/.ansible/tmp/ansible-tmp-1726867599.3106065-32209-205371929827589 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867599.33788: variable 'ansible_module_compression' from source: unknown 30575 1726867599.33825: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30575 1726867599.33874: variable 'ansible_facts' from source: unknown 30575 1726867599.34014: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867599.3106065-32209-205371929827589/AnsiballZ_systemd.py 30575 1726867599.34101: Sending initial data 30575 1726867599.34106: Sent initial data (156 bytes) 30575 1726867599.34526: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867599.34532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867599.34535: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867599.34537: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867599.34539: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867599.34583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867599.34586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867599.34636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867599.36156: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30575 1726867599.36167: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867599.36210: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867599.36254: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpcbef5s77 /root/.ansible/tmp/ansible-tmp-1726867599.3106065-32209-205371929827589/AnsiballZ_systemd.py <<< 30575 1726867599.36257: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867599.3106065-32209-205371929827589/AnsiballZ_systemd.py" <<< 30575 1726867599.36296: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpcbef5s77" to remote "/root/.ansible/tmp/ansible-tmp-1726867599.3106065-32209-205371929827589/AnsiballZ_systemd.py" <<< 30575 1726867599.36299: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867599.3106065-32209-205371929827589/AnsiballZ_systemd.py" <<< 30575 1726867599.37381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867599.37413: stderr chunk (state=3): >>><<< 30575 1726867599.37416: stdout chunk (state=3): >>><<< 30575 1726867599.37452: done transferring module to remote 30575 1726867599.37459: _low_level_execute_command(): starting 30575 1726867599.37462: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867599.3106065-32209-205371929827589/ /root/.ansible/tmp/ansible-tmp-1726867599.3106065-32209-205371929827589/AnsiballZ_systemd.py && sleep 0' 30575 1726867599.37854: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867599.37920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867599.37924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867599.37926: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867599.37928: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867599.37930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867599.37933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867599.37954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867599.37957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867599.38015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867599.39748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867599.39776: stderr chunk (state=3): >>><<< 30575 1726867599.39782: stdout chunk (state=3): >>><<< 30575 1726867599.39791: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867599.39794: _low_level_execute_command(): starting 30575 1726867599.39797: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867599.3106065-32209-205371929827589/AnsiballZ_systemd.py && sleep 0' 30575 1726867599.40262: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867599.40321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867599.40327: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867599.40329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867599.40331: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867599.40337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867599.40408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867599.40412: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867599.40441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867599.69427: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; 
ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10506240", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3316994048", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1800383000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": 
"0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30575 1726867599.69474: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": 
"inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": 
"fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30575 1726867599.71484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867599.71683: stderr chunk (state=3): >>><<< 30575 1726867599.71686: stdout chunk (state=3): >>><<< 30575 1726867599.71690: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10506240", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3316994048", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1800383000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867599.71857: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867599.3106065-32209-205371929827589/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867599.71885: _low_level_execute_command(): starting 30575 1726867599.71904: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867599.3106065-32209-205371929827589/ > /dev/null 2>&1 && sleep 0' 30575 1726867599.72505: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867599.72518: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867599.72536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867599.72556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867599.72574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867599.72590: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867599.72675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867599.72690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867599.72704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867599.72776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867599.74616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867599.74698: stderr chunk (state=3): >>><<< 30575 1726867599.74708: stdout chunk (state=3): >>><<< 30575 1726867599.74725: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867599.74736: handler run complete 30575 1726867599.74823: attempt loop complete, returning result 30575 1726867599.74845: _execute() done 30575 1726867599.74854: dumping result to json 30575 1726867599.74885: done dumping result, returning 30575 1726867599.74921: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-e081-a588-000000000b3f] 30575 1726867599.74941: sending task result for task 0affcac9-a3a5-e081-a588-000000000b3f ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867599.75297: no more pending results, returning what we have 30575 1726867599.75300: results queue empty 30575 1726867599.75301: checking for any_errors_fatal 30575 1726867599.75309: done checking for any_errors_fatal 30575 1726867599.75309: checking for max_fail_percentage 30575 1726867599.75311: done checking for max_fail_percentage 30575 1726867599.75312: checking to see if all hosts have failed and the running result is not ok 30575 1726867599.75313: done checking to see if all hosts have failed 30575 1726867599.75314: getting the remaining hosts for this loop 30575 1726867599.75316: done getting the remaining hosts for this loop 30575 1726867599.75332: getting the next task for host managed_node3 30575 1726867599.75347: done getting next task for host managed_node3 30575 1726867599.75354: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867599.75361: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867599.75381: getting variables 30575 1726867599.75383: in VariableManager get_vars() 30575 1726867599.75425: Calling all_inventory to load vars for managed_node3 30575 1726867599.75429: Calling groups_inventory to load vars for managed_node3 30575 1726867599.75585: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867599.75601: Calling all_plugins_play to load vars for managed_node3 30575 1726867599.75604: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867599.75606: Calling groups_plugins_play to load vars for managed_node3 30575 1726867599.76303: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b3f 30575 1726867599.76306: WORKER PROCESS EXITING 30575 1726867599.78327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867599.80360: done with get_vars() 30575 1726867599.80384: done getting variables 30575 1726867599.80444: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:26:39 -0400 (0:00:00.632) 0:00:35.182 ****** 30575 1726867599.80485: entering _queue_task() for managed_node3/service 30575 1726867599.80915: worker is 1 (out of 1 available) 30575 1726867599.80926: exiting _queue_task() for managed_node3/service 30575 1726867599.80937: done queuing things up, now waiting for results queue to drain 30575 1726867599.80939: waiting for pending results... 30575 1726867599.81175: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867599.81331: in run() - task 0affcac9-a3a5-e081-a588-000000000b40 30575 1726867599.81350: variable 'ansible_search_path' from source: unknown 30575 1726867599.81358: variable 'ansible_search_path' from source: unknown 30575 1726867599.81405: calling self._execute() 30575 1726867599.81494: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867599.81510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867599.81584: variable 'omit' from source: magic vars 30575 1726867599.82547: variable 'ansible_distribution_major_version' from source: facts 30575 1726867599.82610: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867599.82885: variable 'network_provider' from source: set_fact 30575 1726867599.82919: Evaluated conditional (network_provider == "nm"): True 30575 1726867599.83227: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 
1726867599.83471: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867599.83676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867599.86675: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867599.86749: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867599.86791: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867599.86859: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867599.86890: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867599.86989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867599.87027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867599.87069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867599.87111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867599.87155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30575 1726867599.87183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867599.87212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867599.87272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867599.87391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867599.87397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867599.87803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867599.87807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867599.87809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867599.87812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867599.87814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867599.88283: variable 'network_connections' from source: include params 30575 1726867599.88288: variable 'interface' from source: play vars 30575 1726867599.88290: variable 'interface' from source: play vars 30575 1726867599.88597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867599.89159: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867599.89341: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867599.89652: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867599.89707: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867599.89889: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867599.89984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867599.90016: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867599.90076: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867599.90209: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867599.90746: variable 'network_connections' from source: include params 30575 1726867599.90755: variable 'interface' from source: play vars 30575 1726867599.90846: variable 'interface' from source: play vars 30575 1726867599.90948: Evaluated conditional (__network_wpa_supplicant_required): False 30575 1726867599.90987: when evaluation is False, skipping this task 30575 1726867599.90996: _execute() done 30575 1726867599.91005: dumping result to json 30575 1726867599.91013: done dumping result, returning 30575 1726867599.91031: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-e081-a588-000000000b40] 30575 1726867599.91050: sending task result for task 0affcac9-a3a5-e081-a588-000000000b40 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30575 1726867599.91296: no more pending results, returning what we have 30575 1726867599.91300: results queue empty 30575 1726867599.91301: checking for any_errors_fatal 30575 1726867599.91330: done checking for any_errors_fatal 30575 1726867599.91332: checking for max_fail_percentage 30575 1726867599.91334: done checking for max_fail_percentage 30575 1726867599.91335: checking to see if all hosts have failed and the running result is not ok 30575 1726867599.91336: done checking to see if all hosts have failed 30575 1726867599.91337: getting the remaining hosts for this loop 30575 1726867599.91338: done getting the remaining hosts for this loop 30575 1726867599.91343: getting the next task for host managed_node3 30575 1726867599.91352: done getting next task for host managed_node3 30575 1726867599.91357: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable network service 30575 1726867599.91362: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867599.91387: getting variables 30575 1726867599.91390: in VariableManager get_vars() 30575 1726867599.91429: Calling all_inventory to load vars for managed_node3 30575 1726867599.91432: Calling groups_inventory to load vars for managed_node3 30575 1726867599.91434: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867599.91445: Calling all_plugins_play to load vars for managed_node3 30575 1726867599.91448: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867599.91451: Calling groups_plugins_play to load vars for managed_node3 30575 1726867599.92003: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b40 30575 1726867599.92006: WORKER PROCESS EXITING 30575 1726867599.94164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867599.96870: done with get_vars() 30575 1726867599.96905: done getting variables 30575 1726867599.97069: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:26:39 -0400 (0:00:00.167) 0:00:35.349 ****** 30575 1726867599.97206: entering _queue_task() for managed_node3/service 30575 1726867599.97840: worker is 1 (out of 1 available) 30575 1726867599.97850: exiting _queue_task() for managed_node3/service 30575 1726867599.97862: done queuing things up, now waiting for results queue to drain 30575 1726867599.97864: waiting for pending results... 
30575 1726867599.98056: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867599.98218: in run() - task 0affcac9-a3a5-e081-a588-000000000b41 30575 1726867599.98238: variable 'ansible_search_path' from source: unknown 30575 1726867599.98246: variable 'ansible_search_path' from source: unknown 30575 1726867599.98296: calling self._execute() 30575 1726867599.98444: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867599.98455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867599.98480: variable 'omit' from source: magic vars 30575 1726867599.99649: variable 'ansible_distribution_major_version' from source: facts 30575 1726867599.99653: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867600.00096: variable 'network_provider' from source: set_fact 30575 1726867600.00110: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867600.00119: when evaluation is False, skipping this task 30575 1726867600.00128: _execute() done 30575 1726867600.00137: dumping result to json 30575 1726867600.00148: done dumping result, returning 30575 1726867600.00161: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-e081-a588-000000000b41] 30575 1726867600.00200: sending task result for task 0affcac9-a3a5-e081-a588-000000000b41 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867600.00429: no more pending results, returning what we have 30575 1726867600.00433: results queue empty 30575 1726867600.00434: checking for any_errors_fatal 30575 1726867600.00445: done checking for any_errors_fatal 30575 1726867600.00446: checking for max_fail_percentage 30575 1726867600.00448: done checking for max_fail_percentage 30575 
1726867600.00449: checking to see if all hosts have failed and the running result is not ok 30575 1726867600.00450: done checking to see if all hosts have failed 30575 1726867600.00451: getting the remaining hosts for this loop 30575 1726867600.00452: done getting the remaining hosts for this loop 30575 1726867600.00456: getting the next task for host managed_node3 30575 1726867600.00467: done getting next task for host managed_node3 30575 1726867600.00472: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867600.00479: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867600.00508: getting variables 30575 1726867600.00511: in VariableManager get_vars() 30575 1726867600.00550: Calling all_inventory to load vars for managed_node3 30575 1726867600.00553: Calling groups_inventory to load vars for managed_node3 30575 1726867600.00556: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867600.00569: Calling all_plugins_play to load vars for managed_node3 30575 1726867600.00800: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867600.00902: Calling groups_plugins_play to load vars for managed_node3 30575 1726867600.01607: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b41 30575 1726867600.01611: WORKER PROCESS EXITING 30575 1726867600.07063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867600.11290: done with get_vars() 30575 1726867600.11318: done getting variables 30575 1726867600.11405: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:26:40 -0400 (0:00:00.142) 0:00:35.492 ****** 30575 1726867600.11447: entering _queue_task() for managed_node3/copy 30575 1726867600.12009: worker is 1 (out of 1 available) 30575 1726867600.12020: exiting _queue_task() for managed_node3/copy 30575 1726867600.12031: done queuing things up, now waiting for results queue to drain 30575 1726867600.12033: waiting for pending results... 
30575 1726867600.12383: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867600.12583: in run() - task 0affcac9-a3a5-e081-a588-000000000b42 30575 1726867600.12604: variable 'ansible_search_path' from source: unknown 30575 1726867600.12613: variable 'ansible_search_path' from source: unknown 30575 1726867600.12670: calling self._execute() 30575 1726867600.12780: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867600.12794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867600.12809: variable 'omit' from source: magic vars 30575 1726867600.13236: variable 'ansible_distribution_major_version' from source: facts 30575 1726867600.13253: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867600.13387: variable 'network_provider' from source: set_fact 30575 1726867600.13409: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867600.13418: when evaluation is False, skipping this task 30575 1726867600.13482: _execute() done 30575 1726867600.13485: dumping result to json 30575 1726867600.13488: done dumping result, returning 30575 1726867600.13491: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-e081-a588-000000000b42] 30575 1726867600.13493: sending task result for task 0affcac9-a3a5-e081-a588-000000000b42 skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30575 1726867600.13780: no more pending results, returning what we have 30575 1726867600.13785: results queue empty 30575 1726867600.13786: checking for any_errors_fatal 30575 1726867600.13791: done checking for any_errors_fatal 30575 1726867600.13792: checking for max_fail_percentage 30575 
1726867600.13794: done checking for max_fail_percentage 30575 1726867600.13795: checking to see if all hosts have failed and the running result is not ok 30575 1726867600.13796: done checking to see if all hosts have failed 30575 1726867600.13797: getting the remaining hosts for this loop 30575 1726867600.13799: done getting the remaining hosts for this loop 30575 1726867600.13803: getting the next task for host managed_node3 30575 1726867600.13812: done getting next task for host managed_node3 30575 1726867600.13815: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867600.13821: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867600.13849: getting variables 30575 1726867600.13852: in VariableManager get_vars() 30575 1726867600.13899: Calling all_inventory to load vars for managed_node3 30575 1726867600.13902: Calling groups_inventory to load vars for managed_node3 30575 1726867600.13905: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867600.13912: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b42 30575 1726867600.13915: WORKER PROCESS EXITING 30575 1726867600.13931: Calling all_plugins_play to load vars for managed_node3 30575 1726867600.13934: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867600.13938: Calling groups_plugins_play to load vars for managed_node3 30575 1726867600.15462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867600.17121: done with get_vars() 30575 1726867600.17154: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:26:40 -0400 (0:00:00.057) 0:00:35.550 ****** 30575 1726867600.17248: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867600.17563: worker is 1 (out of 1 available) 30575 1726867600.17576: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867600.17698: done queuing things up, now waiting for results queue to drain 30575 1726867600.17700: waiting for pending results... 
30575 1726867600.17921: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867600.18128: in run() - task 0affcac9-a3a5-e081-a588-000000000b43 30575 1726867600.18133: variable 'ansible_search_path' from source: unknown 30575 1726867600.18136: variable 'ansible_search_path' from source: unknown 30575 1726867600.18149: calling self._execute() 30575 1726867600.18261: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867600.18344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867600.18348: variable 'omit' from source: magic vars 30575 1726867600.18696: variable 'ansible_distribution_major_version' from source: facts 30575 1726867600.18712: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867600.18726: variable 'omit' from source: magic vars 30575 1726867600.18808: variable 'omit' from source: magic vars 30575 1726867600.18985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867600.21496: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867600.21563: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867600.21613: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867600.21713: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867600.21716: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867600.21816: variable 'network_provider' from source: set_fact 30575 1726867600.22042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867600.22046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867600.22049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867600.22098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867600.22119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867600.22209: variable 'omit' from source: magic vars 30575 1726867600.22337: variable 'omit' from source: magic vars 30575 1726867600.22497: variable 'network_connections' from source: include params 30575 1726867600.22500: variable 'interface' from source: play vars 30575 1726867600.22547: variable 'interface' from source: play vars 30575 1726867600.22722: variable 'omit' from source: magic vars 30575 1726867600.22740: variable '__lsr_ansible_managed' from source: task vars 30575 1726867600.22804: variable '__lsr_ansible_managed' from source: task vars 30575 1726867600.23147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30575 1726867600.23346: Loaded config def from plugin (lookup/template) 30575 1726867600.23357: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30575 1726867600.23403: File lookup term: get_ansible_managed.j2 30575 1726867600.23411: variable 
'ansible_search_path' from source: unknown 30575 1726867600.23554: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30575 1726867600.23563: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30575 1726867600.23566: variable 'ansible_search_path' from source: unknown 30575 1726867600.36187: variable 'ansible_managed' from source: unknown 30575 1726867600.36731: variable 'omit' from source: magic vars 30575 1726867600.36734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867600.36806: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867600.36839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867600.36929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30575 1726867600.36949: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867600.37056: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867600.37059: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867600.37063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867600.37236: Set connection var ansible_pipelining to False 30575 1726867600.37246: Set connection var ansible_shell_type to sh 30575 1726867600.37256: Set connection var ansible_shell_executable to /bin/sh 30575 1726867600.37271: Set connection var ansible_timeout to 10 30575 1726867600.37288: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867600.37300: Set connection var ansible_connection to ssh 30575 1726867600.37356: variable 'ansible_shell_executable' from source: unknown 30575 1726867600.37364: variable 'ansible_connection' from source: unknown 30575 1726867600.37371: variable 'ansible_module_compression' from source: unknown 30575 1726867600.37384: variable 'ansible_shell_type' from source: unknown 30575 1726867600.37392: variable 'ansible_shell_executable' from source: unknown 30575 1726867600.37452: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867600.37455: variable 'ansible_pipelining' from source: unknown 30575 1726867600.37458: variable 'ansible_timeout' from source: unknown 30575 1726867600.37460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867600.37582: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867600.37713: variable 'omit' from 
source: magic vars 30575 1726867600.37716: starting attempt loop 30575 1726867600.37719: running the handler 30575 1726867600.37722: _low_level_execute_command(): starting 30575 1726867600.37736: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867600.39027: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867600.39046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867600.39105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867600.39122: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867600.39261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867600.39316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867600.41071: stdout chunk (state=3): >>>/root <<< 30575 1726867600.41121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867600.41135: stdout chunk (state=3): >>><<< 30575 1726867600.41148: stderr chunk (state=3): >>><<< 30575 
1726867600.41203: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867600.41384: _low_level_execute_command(): starting 30575 1726867600.41387: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867600.4120963-32260-84416641877694 `" && echo ansible-tmp-1726867600.4120963-32260-84416641877694="` echo /root/.ansible/tmp/ansible-tmp-1726867600.4120963-32260-84416641877694 `" ) && sleep 0' 30575 1726867600.42317: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867600.42533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867600.42656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867600.42907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867600.44652: stdout chunk (state=3): >>>ansible-tmp-1726867600.4120963-32260-84416641877694=/root/.ansible/tmp/ansible-tmp-1726867600.4120963-32260-84416641877694 <<< 30575 1726867600.44814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867600.44817: stdout chunk (state=3): >>><<< 30575 1726867600.44819: stderr chunk (state=3): >>><<< 30575 1726867600.44842: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867600.4120963-32260-84416641877694=/root/.ansible/tmp/ansible-tmp-1726867600.4120963-32260-84416641877694 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867600.45009: variable 'ansible_module_compression' from source: unknown 30575 1726867600.45087: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30575 1726867600.45207: variable 'ansible_facts' from source: unknown 30575 1726867600.45662: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867600.4120963-32260-84416641877694/AnsiballZ_network_connections.py 30575 1726867600.46100: Sending initial data 30575 1726867600.46103: Sent initial data (167 bytes) 30575 1726867600.47386: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867600.47389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867600.47391: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867600.47394: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867600.47416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867600.47654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867600.47730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867600.48165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867600.49517: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30575 1726867600.49545: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867600.49595: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH 
"." <<< 30575 1726867600.49812: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp385m6bdh /root/.ansible/tmp/ansible-tmp-1726867600.4120963-32260-84416641877694/AnsiballZ_network_connections.py <<< 30575 1726867600.49834: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867600.4120963-32260-84416641877694/AnsiballZ_network_connections.py" <<< 30575 1726867600.49909: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp385m6bdh" to remote "/root/.ansible/tmp/ansible-tmp-1726867600.4120963-32260-84416641877694/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867600.4120963-32260-84416641877694/AnsiballZ_network_connections.py" <<< 30575 1726867600.51961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867600.52109: stderr chunk (state=3): >>><<< 30575 1726867600.52112: stdout chunk (state=3): >>><<< 30575 1726867600.52288: done transferring module to remote 30575 1726867600.52291: _low_level_execute_command(): starting 30575 1726867600.52293: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867600.4120963-32260-84416641877694/ /root/.ansible/tmp/ansible-tmp-1726867600.4120963-32260-84416641877694/AnsiballZ_network_connections.py && sleep 0' 30575 1726867600.53472: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867600.53495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867600.53506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867600.53609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867600.53633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867600.53769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867600.55698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867600.55745: stderr chunk (state=3): >>><<< 30575 1726867600.55790: stdout chunk (state=3): >>><<< 30575 1726867600.55838: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867600.56025: _low_level_execute_command(): starting 30575 1726867600.56028: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867600.4120963-32260-84416641877694/AnsiballZ_network_connections.py && sleep 0' 30575 1726867600.56805: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867600.56812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867600.56833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867600.56908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867600.56955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867600.56968: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 30575 1726867600.56976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867600.57159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867600.84352: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ade586ae-171f-45bd-a4ea-cde3464255eb\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30575 1726867600.87022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867600.87408: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867600.87413: stdout chunk (state=3): >>><<< 30575 1726867600.87415: stderr chunk (state=3): >>><<< 30575 1726867600.87423: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ade586ae-171f-45bd-a4ea-cde3464255eb\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867600.87426: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867600.4120963-32260-84416641877694/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867600.87429: _low_level_execute_command(): starting 30575 1726867600.87433: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867600.4120963-32260-84416641877694/ > /dev/null 2>&1 && sleep 0' 30575 1726867600.88399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867600.88554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867600.88659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867600.90476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867600.90516: stderr chunk (state=3): >>><<< 30575 1726867600.90551: stdout chunk (state=3): >>><<< 30575 1726867600.90594: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867600.90599: handler run complete 30575 1726867600.90631: attempt loop complete, returning result 30575 1726867600.90634: _execute() done 30575 1726867600.90637: dumping result to json 30575 1726867600.90661: done dumping result, returning 30575 1726867600.90664: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-e081-a588-000000000b43] 30575 1726867600.90666: sending task result for task 0affcac9-a3a5-e081-a588-000000000b43 30575 1726867600.90934: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b43 30575 1726867600.90937: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ade586ae-171f-45bd-a4ea-cde3464255eb 30575 1726867600.91039: no more pending results, returning what we have 30575 1726867600.91042: results queue empty 30575 1726867600.91043: checking for any_errors_fatal 30575 1726867600.91054: done checking for any_errors_fatal 30575 1726867600.91059: checking for max_fail_percentage 30575 1726867600.91060: done checking for max_fail_percentage 30575 1726867600.91061: checking to see if all hosts have failed and the running result is not ok 30575 1726867600.91062: done checking to see if all hosts have failed 30575 1726867600.91063: getting the remaining hosts for this loop 30575 
1726867600.91065: done getting the remaining hosts for this loop 30575 1726867600.91068: getting the next task for host managed_node3 30575 1726867600.91076: done getting next task for host managed_node3 30575 1726867600.91082: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867600.91086: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867600.91099: getting variables 30575 1726867600.91101: in VariableManager get_vars() 30575 1726867600.91137: Calling all_inventory to load vars for managed_node3 30575 1726867600.91139: Calling groups_inventory to load vars for managed_node3 30575 1726867600.91142: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867600.91150: Calling all_plugins_play to load vars for managed_node3 30575 1726867600.91153: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867600.91272: Calling groups_plugins_play to load vars for managed_node3 30575 1726867600.92931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867600.94539: done with get_vars() 30575 1726867600.94562: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:26:40 -0400 (0:00:00.774) 0:00:36.324 ****** 30575 1726867600.94651: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867600.95003: worker is 1 (out of 1 available) 30575 1726867600.95016: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867600.95032: done queuing things up, now waiting for results queue to drain 30575 1726867600.95034: waiting for pending results... 
30575 1726867600.95407: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867600.95514: in run() - task 0affcac9-a3a5-e081-a588-000000000b44 30575 1726867600.95539: variable 'ansible_search_path' from source: unknown 30575 1726867600.95548: variable 'ansible_search_path' from source: unknown 30575 1726867600.95593: calling self._execute() 30575 1726867600.95720: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867600.95727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867600.95730: variable 'omit' from source: magic vars 30575 1726867600.96157: variable 'ansible_distribution_major_version' from source: facts 30575 1726867600.96160: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867600.96264: variable 'network_state' from source: role '' defaults 30575 1726867600.96284: Evaluated conditional (network_state != {}): False 30575 1726867600.96291: when evaluation is False, skipping this task 30575 1726867600.96298: _execute() done 30575 1726867600.96305: dumping result to json 30575 1726867600.96376: done dumping result, returning 30575 1726867600.96381: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-e081-a588-000000000b44] 30575 1726867600.96384: sending task result for task 0affcac9-a3a5-e081-a588-000000000b44 30575 1726867600.96456: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b44 30575 1726867600.96459: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867600.96515: no more pending results, returning what we have 30575 1726867600.96519: results queue empty 30575 1726867600.96520: checking for any_errors_fatal 30575 1726867600.96534: done checking for any_errors_fatal 
30575 1726867600.96534: checking for max_fail_percentage 30575 1726867600.96536: done checking for max_fail_percentage 30575 1726867600.96537: checking to see if all hosts have failed and the running result is not ok 30575 1726867600.96538: done checking to see if all hosts have failed 30575 1726867600.96539: getting the remaining hosts for this loop 30575 1726867600.96540: done getting the remaining hosts for this loop 30575 1726867600.96544: getting the next task for host managed_node3 30575 1726867600.96553: done getting next task for host managed_node3 30575 1726867600.96556: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867600.96561: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867600.96585: getting variables 30575 1726867600.96587: in VariableManager get_vars() 30575 1726867600.96626: Calling all_inventory to load vars for managed_node3 30575 1726867600.96628: Calling groups_inventory to load vars for managed_node3 30575 1726867600.96631: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867600.96643: Calling all_plugins_play to load vars for managed_node3 30575 1726867600.96646: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867600.96649: Calling groups_plugins_play to load vars for managed_node3 30575 1726867600.98386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867600.99974: done with get_vars() 30575 1726867601.00000: done getting variables 30575 1726867601.00060: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:26:41 -0400 (0:00:00.054) 0:00:36.378 ****** 30575 1726867601.00097: entering _queue_task() for managed_node3/debug 30575 1726867601.00433: worker is 1 (out of 1 available) 30575 1726867601.00447: exiting _queue_task() for managed_node3/debug 30575 1726867601.00459: done queuing things up, now waiting for results queue to drain 30575 1726867601.00460: waiting for pending results... 
30575 1726867601.00897: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867601.00902: in run() - task 0affcac9-a3a5-e081-a588-000000000b45 30575 1726867601.00910: variable 'ansible_search_path' from source: unknown 30575 1726867601.00916: variable 'ansible_search_path' from source: unknown 30575 1726867601.00956: calling self._execute() 30575 1726867601.01083: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867601.01086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867601.01088: variable 'omit' from source: magic vars 30575 1726867601.01430: variable 'ansible_distribution_major_version' from source: facts 30575 1726867601.01450: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867601.01461: variable 'omit' from source: magic vars 30575 1726867601.01532: variable 'omit' from source: magic vars 30575 1726867601.01639: variable 'omit' from source: magic vars 30575 1726867601.01642: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867601.01664: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867601.01689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867601.01708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867601.01721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867601.01765: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867601.01774: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867601.01784: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30575 1726867601.01899: Set connection var ansible_pipelining to False 30575 1726867601.01908: Set connection var ansible_shell_type to sh 30575 1726867601.01920: Set connection var ansible_shell_executable to /bin/sh 30575 1726867601.01935: Set connection var ansible_timeout to 10 30575 1726867601.01946: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867601.01963: Set connection var ansible_connection to ssh 30575 1726867601.02073: variable 'ansible_shell_executable' from source: unknown 30575 1726867601.02076: variable 'ansible_connection' from source: unknown 30575 1726867601.02080: variable 'ansible_module_compression' from source: unknown 30575 1726867601.02082: variable 'ansible_shell_type' from source: unknown 30575 1726867601.02087: variable 'ansible_shell_executable' from source: unknown 30575 1726867601.02090: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867601.02091: variable 'ansible_pipelining' from source: unknown 30575 1726867601.02093: variable 'ansible_timeout' from source: unknown 30575 1726867601.02095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867601.02165: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867601.02185: variable 'omit' from source: magic vars 30575 1726867601.02199: starting attempt loop 30575 1726867601.02206: running the handler 30575 1726867601.02339: variable '__network_connections_result' from source: set_fact 30575 1726867601.02392: handler run complete 30575 1726867601.02430: attempt loop complete, returning result 30575 1726867601.02439: _execute() done 30575 1726867601.02445: dumping result to json 30575 1726867601.02454: 
done dumping result, returning 30575 1726867601.02468: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-e081-a588-000000000b45] 30575 1726867601.02480: sending task result for task 0affcac9-a3a5-e081-a588-000000000b45 30575 1726867601.02588: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b45 ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ade586ae-171f-45bd-a4ea-cde3464255eb" ] } 30575 1726867601.02681: no more pending results, returning what we have 30575 1726867601.02685: results queue empty 30575 1726867601.02686: checking for any_errors_fatal 30575 1726867601.02690: done checking for any_errors_fatal 30575 1726867601.02690: checking for max_fail_percentage 30575 1726867601.02692: done checking for max_fail_percentage 30575 1726867601.02693: checking to see if all hosts have failed and the running result is not ok 30575 1726867601.02694: done checking to see if all hosts have failed 30575 1726867601.02694: getting the remaining hosts for this loop 30575 1726867601.02696: done getting the remaining hosts for this loop 30575 1726867601.02699: getting the next task for host managed_node3 30575 1726867601.02707: done getting next task for host managed_node3 30575 1726867601.02711: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867601.02716: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867601.02730: getting variables 30575 1726867601.02732: in VariableManager get_vars() 30575 1726867601.02766: Calling all_inventory to load vars for managed_node3 30575 1726867601.02769: Calling groups_inventory to load vars for managed_node3 30575 1726867601.02772: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867601.03010: Calling all_plugins_play to load vars for managed_node3 30575 1726867601.03014: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867601.03018: Calling groups_plugins_play to load vars for managed_node3 30575 1726867601.03661: WORKER PROCESS EXITING 30575 1726867601.04626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867601.07690: done with get_vars() 30575 1726867601.07762: done getting variables 30575 1726867601.07853: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network 
: Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:26:41 -0400 (0:00:00.077) 0:00:36.456 ****** 30575 1726867601.07894: entering _queue_task() for managed_node3/debug 30575 1726867601.08302: worker is 1 (out of 1 available) 30575 1726867601.08315: exiting _queue_task() for managed_node3/debug 30575 1726867601.08328: done queuing things up, now waiting for results queue to drain 30575 1726867601.08330: waiting for pending results... 30575 1726867601.08673: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867601.08817: in run() - task 0affcac9-a3a5-e081-a588-000000000b46 30575 1726867601.08838: variable 'ansible_search_path' from source: unknown 30575 1726867601.08847: variable 'ansible_search_path' from source: unknown 30575 1726867601.08894: calling self._execute() 30575 1726867601.08996: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867601.09012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867601.09071: variable 'omit' from source: magic vars 30575 1726867601.09646: variable 'ansible_distribution_major_version' from source: facts 30575 1726867601.09663: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867601.09675: variable 'omit' from source: magic vars 30575 1726867601.09734: variable 'omit' from source: magic vars 30575 1726867601.10085: variable 'omit' from source: magic vars 30575 1726867601.10088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867601.10091: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867601.10094: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867601.10096: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867601.10198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867601.10235: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867601.10244: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867601.10252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867601.10351: Set connection var ansible_pipelining to False 30575 1726867601.10489: Set connection var ansible_shell_type to sh 30575 1726867601.10502: Set connection var ansible_shell_executable to /bin/sh 30575 1726867601.10514: Set connection var ansible_timeout to 10 30575 1726867601.10526: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867601.10538: Set connection var ansible_connection to ssh 30575 1726867601.10564: variable 'ansible_shell_executable' from source: unknown 30575 1726867601.10571: variable 'ansible_connection' from source: unknown 30575 1726867601.10580: variable 'ansible_module_compression' from source: unknown 30575 1726867601.10590: variable 'ansible_shell_type' from source: unknown 30575 1726867601.10596: variable 'ansible_shell_executable' from source: unknown 30575 1726867601.10632: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867601.10640: variable 'ansible_pipelining' from source: unknown 30575 1726867601.10646: variable 'ansible_timeout' from source: unknown 30575 1726867601.10654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867601.11065: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867601.11069: variable 'omit' from source: magic vars 30575 1726867601.11072: starting attempt loop 30575 1726867601.11074: running the handler 30575 1726867601.11078: variable '__network_connections_result' from source: set_fact 30575 1726867601.11236: variable '__network_connections_result' from source: set_fact 30575 1726867601.11393: handler run complete 30575 1726867601.11421: attempt loop complete, returning result 30575 1726867601.11427: _execute() done 30575 1726867601.11433: dumping result to json 30575 1726867601.11440: done dumping result, returning 30575 1726867601.11451: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-e081-a588-000000000b46] 30575 1726867601.11470: sending task result for task 0affcac9-a3a5-e081-a588-000000000b46 30575 1726867601.11890: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b46 30575 1726867601.11898: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ade586ae-171f-45bd-a4ea-cde3464255eb\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ade586ae-171f-45bd-a4ea-cde3464255eb" ] } } 30575 1726867601.11989: no more pending results, returning what we have 30575 1726867601.11992: results queue 
empty 30575 1726867601.11993: checking for any_errors_fatal 30575 1726867601.11999: done checking for any_errors_fatal 30575 1726867601.12000: checking for max_fail_percentage 30575 1726867601.12003: done checking for max_fail_percentage 30575 1726867601.12004: checking to see if all hosts have failed and the running result is not ok 30575 1726867601.12005: done checking to see if all hosts have failed 30575 1726867601.12005: getting the remaining hosts for this loop 30575 1726867601.12007: done getting the remaining hosts for this loop 30575 1726867601.12010: getting the next task for host managed_node3 30575 1726867601.12017: done getting next task for host managed_node3 30575 1726867601.12021: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867601.12026: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867601.12037: getting variables 30575 1726867601.12039: in VariableManager get_vars() 30575 1726867601.12299: Calling all_inventory to load vars for managed_node3 30575 1726867601.12302: Calling groups_inventory to load vars for managed_node3 30575 1726867601.12305: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867601.12315: Calling all_plugins_play to load vars for managed_node3 30575 1726867601.12317: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867601.12320: Calling groups_plugins_play to load vars for managed_node3 30575 1726867601.15513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867601.20325: done with get_vars() 30575 1726867601.20349: done getting variables 30575 1726867601.20412: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:26:41 -0400 (0:00:00.125) 0:00:36.582 ****** 30575 1726867601.20449: entering _queue_task() for managed_node3/debug 30575 1726867601.21391: worker is 1 (out of 1 available) 30575 1726867601.21403: exiting _queue_task() for managed_node3/debug 30575 1726867601.21415: done queuing things up, now waiting for results queue to drain 30575 1726867601.21416: waiting for pending results... 
30575 1726867601.22045: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867601.22323: in run() - task 0affcac9-a3a5-e081-a588-000000000b47 30575 1726867601.22327: variable 'ansible_search_path' from source: unknown 30575 1726867601.22330: variable 'ansible_search_path' from source: unknown 30575 1726867601.22390: calling self._execute() 30575 1726867601.22650: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867601.22654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867601.22657: variable 'omit' from source: magic vars 30575 1726867601.23083: variable 'ansible_distribution_major_version' from source: facts 30575 1726867601.23087: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867601.23090: variable 'network_state' from source: role '' defaults 30575 1726867601.23092: Evaluated conditional (network_state != {}): False 30575 1726867601.23094: when evaluation is False, skipping this task 30575 1726867601.23095: _execute() done 30575 1726867601.23098: dumping result to json 30575 1726867601.23099: done dumping result, returning 30575 1726867601.23102: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-e081-a588-000000000b47] 30575 1726867601.23103: sending task result for task 0affcac9-a3a5-e081-a588-000000000b47 30575 1726867601.23161: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b47 30575 1726867601.23164: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30575 1726867601.23210: no more pending results, returning what we have 30575 1726867601.23214: results queue empty 30575 1726867601.23215: checking for any_errors_fatal 30575 1726867601.23225: done checking for any_errors_fatal 30575 1726867601.23226: checking for 
max_fail_percentage 30575 1726867601.23227: done checking for max_fail_percentage 30575 1726867601.23228: checking to see if all hosts have failed and the running result is not ok 30575 1726867601.23229: done checking to see if all hosts have failed 30575 1726867601.23230: getting the remaining hosts for this loop 30575 1726867601.23231: done getting the remaining hosts for this loop 30575 1726867601.23235: getting the next task for host managed_node3 30575 1726867601.23242: done getting next task for host managed_node3 30575 1726867601.23246: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867601.23251: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867601.23271: getting variables 30575 1726867601.23273: in VariableManager get_vars() 30575 1726867601.23305: Calling all_inventory to load vars for managed_node3 30575 1726867601.23307: Calling groups_inventory to load vars for managed_node3 30575 1726867601.23309: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867601.23318: Calling all_plugins_play to load vars for managed_node3 30575 1726867601.23320: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867601.23322: Calling groups_plugins_play to load vars for managed_node3 30575 1726867601.25750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867601.27401: done with get_vars() 30575 1726867601.27422: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:26:41 -0400 (0:00:00.070) 0:00:36.652 ****** 30575 1726867601.27521: entering _queue_task() for managed_node3/ping 30575 1726867601.27845: worker is 1 (out of 1 available) 30575 1726867601.27858: exiting _queue_task() for managed_node3/ping 30575 1726867601.27870: done queuing things up, now waiting for results queue to drain 30575 1726867601.27872: waiting for pending results... 
30575 1726867601.28166: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867601.28321: in run() - task 0affcac9-a3a5-e081-a588-000000000b48 30575 1726867601.28342: variable 'ansible_search_path' from source: unknown 30575 1726867601.28350: variable 'ansible_search_path' from source: unknown 30575 1726867601.28393: calling self._execute() 30575 1726867601.28713: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867601.28716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867601.28718: variable 'omit' from source: magic vars 30575 1726867601.28997: variable 'ansible_distribution_major_version' from source: facts 30575 1726867601.29039: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867601.29134: variable 'omit' from source: magic vars 30575 1726867601.29384: variable 'omit' from source: magic vars 30575 1726867601.29387: variable 'omit' from source: magic vars 30575 1726867601.29389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867601.29501: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867601.29525: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867601.29547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867601.29602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867601.29637: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867601.29707: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867601.29716: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867601.29922: Set connection var ansible_pipelining to False 30575 1726867601.29931: Set connection var ansible_shell_type to sh 30575 1726867601.29940: Set connection var ansible_shell_executable to /bin/sh 30575 1726867601.29948: Set connection var ansible_timeout to 10 30575 1726867601.30136: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867601.30139: Set connection var ansible_connection to ssh 30575 1726867601.30142: variable 'ansible_shell_executable' from source: unknown 30575 1726867601.30144: variable 'ansible_connection' from source: unknown 30575 1726867601.30146: variable 'ansible_module_compression' from source: unknown 30575 1726867601.30148: variable 'ansible_shell_type' from source: unknown 30575 1726867601.30150: variable 'ansible_shell_executable' from source: unknown 30575 1726867601.30152: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867601.30153: variable 'ansible_pipelining' from source: unknown 30575 1726867601.30155: variable 'ansible_timeout' from source: unknown 30575 1726867601.30157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867601.30574: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867601.30643: variable 'omit' from source: magic vars 30575 1726867601.30695: starting attempt loop 30575 1726867601.30703: running the handler 30575 1726867601.30722: _low_level_execute_command(): starting 30575 1726867601.30734: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867601.31476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867601.31490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 
1726867601.31502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867601.31516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867601.31549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867601.31556: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867601.31594: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867601.31670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867601.31674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867601.31700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867601.31792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867601.33500: stdout chunk (state=3): >>>/root <<< 30575 1726867601.33576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867601.33994: stderr chunk (state=3): >>><<< 30575 1726867601.33997: stdout chunk (state=3): >>><<< 30575 1726867601.34001: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867601.34003: _low_level_execute_command(): starting 30575 1726867601.34006: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867601.3379571-32324-100829796800328 `" && echo ansible-tmp-1726867601.3379571-32324-100829796800328="` echo /root/.ansible/tmp/ansible-tmp-1726867601.3379571-32324-100829796800328 `" ) && sleep 0' 30575 1726867601.34881: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867601.34899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867601.34971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867601.37016: stdout chunk (state=3): >>>ansible-tmp-1726867601.3379571-32324-100829796800328=/root/.ansible/tmp/ansible-tmp-1726867601.3379571-32324-100829796800328 <<< 30575 1726867601.37071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867601.37075: stdout chunk (state=3): >>><<< 30575 1726867601.37285: stderr chunk (state=3): >>><<< 30575 1726867601.37303: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867601.3379571-32324-100829796800328=/root/.ansible/tmp/ansible-tmp-1726867601.3379571-32324-100829796800328 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867601.37349: variable 'ansible_module_compression' from source: unknown 30575 1726867601.37401: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30575 1726867601.37429: variable 'ansible_facts' from source: unknown 30575 1726867601.37585: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867601.3379571-32324-100829796800328/AnsiballZ_ping.py 30575 1726867601.37806: Sending initial data 30575 1726867601.37809: Sent initial data (153 bytes) 30575 1726867601.38333: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867601.38395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867601.38411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867601.38795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867601.38801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867601.38804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867601.38806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867601.38837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867601.40384: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30575 1726867601.40399: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30575 1726867601.40411: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30575 1726867601.40426: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 30575 1726867601.40439: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 30575 1726867601.40455: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867601.40529: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867601.40600: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp32pnfa39 /root/.ansible/tmp/ansible-tmp-1726867601.3379571-32324-100829796800328/AnsiballZ_ping.py <<< 30575 1726867601.40627: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867601.3379571-32324-100829796800328/AnsiballZ_ping.py" <<< 30575 1726867601.40668: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp32pnfa39" to remote "/root/.ansible/tmp/ansible-tmp-1726867601.3379571-32324-100829796800328/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867601.3379571-32324-100829796800328/AnsiballZ_ping.py" <<< 30575 1726867601.41566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867601.41629: stderr chunk (state=3): >>><<< 30575 1726867601.41804: stdout chunk (state=3): >>><<< 30575 1726867601.41807: done transferring module to remote 30575 1726867601.41810: _low_level_execute_command(): starting 30575 1726867601.41813: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867601.3379571-32324-100829796800328/ /root/.ansible/tmp/ansible-tmp-1726867601.3379571-32324-100829796800328/AnsiballZ_ping.py && sleep 0' 30575 1726867601.42713: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867601.42836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867601.42839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867601.42842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867601.42844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867601.42846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867601.42869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867601.42941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867601.44686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867601.44730: stderr chunk (state=3): >>><<< 30575 1726867601.44783: stdout chunk (state=3): >>><<< 30575 1726867601.44789: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867601.44792: _low_level_execute_command(): starting 30575 1726867601.44794: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867601.3379571-32324-100829796800328/AnsiballZ_ping.py && sleep 0' 30575 1726867601.45801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867601.45885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867601.45888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867601.45890: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30575 1726867601.45938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867601.60907: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30575 1726867601.62276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867601.62281: stdout chunk (state=3): >>><<< 30575 1726867601.62283: stderr chunk (state=3): >>><<< 30575 1726867601.62309: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867601.62449: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867601.3379571-32324-100829796800328/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867601.62452: _low_level_execute_command(): starting 30575 1726867601.62455: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867601.3379571-32324-100829796800328/ > /dev/null 2>&1 && sleep 0' 30575 1726867601.63064: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867601.63082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867601.63104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867601.63130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867601.63148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867601.63165: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867601.63194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867601.63212: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867601.63316: stderr chunk (state=3): >>>debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867601.63347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867601.63431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867601.65266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867601.65283: stdout chunk (state=3): >>><<< 30575 1726867601.65295: stderr chunk (state=3): >>><<< 30575 1726867601.65316: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867601.65336: handler run complete 30575 1726867601.65364: attempt loop complete, returning result 30575 1726867601.65371: _execute() done 30575 1726867601.65382: dumping result to json 30575 1726867601.65390: done dumping result, returning 30575 1726867601.65406: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-e081-a588-000000000b48] 30575 1726867601.65415: sending task result for task 0affcac9-a3a5-e081-a588-000000000b48 ok: [managed_node3] => { "changed": false, "ping": "pong" } 30575 1726867601.65629: done sending task result for task 0affcac9-a3a5-e081-a588-000000000b48 30575 1726867601.65632: WORKER PROCESS EXITING 30575 1726867601.65685: no more pending results, returning what we have 30575 1726867601.65689: results queue empty 30575 1726867601.65692: checking for any_errors_fatal 30575 1726867601.65698: done checking for any_errors_fatal 30575 1726867601.65698: checking for max_fail_percentage 30575 1726867601.65701: done checking for max_fail_percentage 30575 1726867601.65702: checking to see if all hosts have failed and the running result is not ok 30575 1726867601.65703: done checking to see if all hosts have failed 30575 1726867601.65703: getting the remaining hosts for this loop 30575 1726867601.65705: done getting the remaining hosts for this loop 30575 1726867601.65708: getting the next task for host managed_node3 30575 1726867601.65719: done getting next task for host managed_node3 30575 1726867601.65722: ^ task is: TASK: meta (role_complete) 30575 1726867601.65728: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867601.65741: getting variables 30575 1726867601.65742: in VariableManager get_vars() 30575 1726867601.65786: Calling all_inventory to load vars for managed_node3 30575 1726867601.65789: Calling groups_inventory to load vars for managed_node3 30575 1726867601.65791: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867601.65801: Calling all_plugins_play to load vars for managed_node3 30575 1726867601.65803: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867601.65805: Calling groups_plugins_play to load vars for managed_node3 30575 1726867601.72354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867601.73968: done with get_vars() 30575 1726867601.73996: done getting variables 30575 1726867601.74074: done queuing things up, now waiting for results queue to drain 30575 1726867601.74076: results queue empty 30575 1726867601.74076: checking for any_errors_fatal 30575 1726867601.74081: done checking for 
any_errors_fatal 30575 1726867601.74081: checking for max_fail_percentage 30575 1726867601.74082: done checking for max_fail_percentage 30575 1726867601.74083: checking to see if all hosts have failed and the running result is not ok 30575 1726867601.74084: done checking to see if all hosts have failed 30575 1726867601.74085: getting the remaining hosts for this loop 30575 1726867601.74085: done getting the remaining hosts for this loop 30575 1726867601.74088: getting the next task for host managed_node3 30575 1726867601.74092: done getting next task for host managed_node3 30575 1726867601.74094: ^ task is: TASK: Show result 30575 1726867601.74096: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867601.74099: getting variables 30575 1726867601.74100: in VariableManager get_vars() 30575 1726867601.74115: Calling all_inventory to load vars for managed_node3 30575 1726867601.74118: Calling groups_inventory to load vars for managed_node3 30575 1726867601.74120: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867601.74128: Calling all_plugins_play to load vars for managed_node3 30575 1726867601.74131: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867601.74134: Calling groups_plugins_play to load vars for managed_node3 30575 1726867601.75202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867601.76669: done with get_vars() 30575 1726867601.76689: done getting variables 30575 1726867601.76725: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 17:26:41 -0400 (0:00:00.492) 0:00:37.145 ****** 30575 1726867601.76749: entering _queue_task() for managed_node3/debug 30575 1726867601.77101: worker is 1 (out of 1 available) 30575 1726867601.77113: exiting _queue_task() for managed_node3/debug 30575 1726867601.77125: done queuing things up, now waiting for results queue to drain 30575 1726867601.77128: waiting for pending results... 
30575 1726867601.77419: running TaskExecutor() for managed_node3/TASK: Show result 30575 1726867601.77585: in run() - task 0affcac9-a3a5-e081-a588-000000000ad2 30575 1726867601.77590: variable 'ansible_search_path' from source: unknown 30575 1726867601.77592: variable 'ansible_search_path' from source: unknown 30575 1726867601.77710: calling self._execute() 30575 1726867601.77726: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867601.77737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867601.77751: variable 'omit' from source: magic vars 30575 1726867601.78167: variable 'ansible_distribution_major_version' from source: facts 30575 1726867601.78187: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867601.78199: variable 'omit' from source: magic vars 30575 1726867601.78250: variable 'omit' from source: magic vars 30575 1726867601.78294: variable 'omit' from source: magic vars 30575 1726867601.78365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867601.78387: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867601.78412: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867601.78434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867601.78475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867601.78496: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867601.78505: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867601.78582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867601.78626: Set 
connection var ansible_pipelining to False 30575 1726867601.78636: Set connection var ansible_shell_type to sh 30575 1726867601.78647: Set connection var ansible_shell_executable to /bin/sh 30575 1726867601.78657: Set connection var ansible_timeout to 10 30575 1726867601.78668: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867601.78683: Set connection var ansible_connection to ssh 30575 1726867601.78715: variable 'ansible_shell_executable' from source: unknown 30575 1726867601.78724: variable 'ansible_connection' from source: unknown 30575 1726867601.78732: variable 'ansible_module_compression' from source: unknown 30575 1726867601.78806: variable 'ansible_shell_type' from source: unknown 30575 1726867601.78809: variable 'ansible_shell_executable' from source: unknown 30575 1726867601.78811: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867601.78813: variable 'ansible_pipelining' from source: unknown 30575 1726867601.78815: variable 'ansible_timeout' from source: unknown 30575 1726867601.78817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867601.78912: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867601.78932: variable 'omit' from source: magic vars 30575 1726867601.78943: starting attempt loop 30575 1726867601.79028: running the handler 30575 1726867601.79032: variable '__network_connections_result' from source: set_fact 30575 1726867601.79090: variable '__network_connections_result' from source: set_fact 30575 1726867601.79218: handler run complete 30575 1726867601.79254: attempt loop complete, returning result 30575 1726867601.79261: _execute() done 30575 1726867601.79269: dumping result to json 30575 
1726867601.79281: done dumping result, returning 30575 1726867601.79293: done running TaskExecutor() for managed_node3/TASK: Show result [0affcac9-a3a5-e081-a588-000000000ad2] 30575 1726867601.79303: sending task result for task 0affcac9-a3a5-e081-a588-000000000ad2 ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ade586ae-171f-45bd-a4ea-cde3464255eb\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ade586ae-171f-45bd-a4ea-cde3464255eb" ] } } 30575 1726867601.79484: no more pending results, returning what we have 30575 1726867601.79491: results queue empty 30575 1726867601.79492: checking for any_errors_fatal 30575 1726867601.79494: done checking for any_errors_fatal 30575 1726867601.79495: checking for max_fail_percentage 30575 1726867601.79497: done checking for max_fail_percentage 30575 1726867601.79498: checking to see if all hosts have failed and the running result is not ok 30575 1726867601.79499: done checking to see if all hosts have failed 30575 1726867601.79500: getting the remaining hosts for this loop 30575 1726867601.79501: done getting the remaining hosts for this loop 30575 1726867601.79505: getting the next task for host managed_node3 30575 1726867601.79516: done getting next task for host managed_node3 30575 1726867601.79520: ^ task is: TASK: Test 30575 1726867601.79523: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867601.79528: getting variables 30575 1726867601.79530: in VariableManager get_vars() 30575 1726867601.79563: Calling all_inventory to load vars for managed_node3 30575 1726867601.79566: Calling groups_inventory to load vars for managed_node3 30575 1726867601.79570: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867601.79884: Calling all_plugins_play to load vars for managed_node3 30575 1726867601.79888: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867601.79893: Calling groups_plugins_play to load vars for managed_node3 30575 1726867601.80589: done sending task result for task 0affcac9-a3a5-e081-a588-000000000ad2 30575 1726867601.80593: WORKER PROCESS EXITING 30575 1726867601.81310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867601.82755: done with get_vars() 30575 1726867601.82774: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 17:26:41 -0400 (0:00:00.061) 0:00:37.206 ****** 30575 1726867601.82865: entering _queue_task() for managed_node3/include_tasks 30575 1726867601.83145: worker is 1 (out of 1 available) 30575 1726867601.83158: exiting _queue_task() for managed_node3/include_tasks 30575 1726867601.83170: done queuing things up, now waiting for results queue to drain 30575 1726867601.83172: waiting for pending results... 
30575 1726867601.83458: running TaskExecutor() for managed_node3/TASK: Test 30575 1726867601.83591: in run() - task 0affcac9-a3a5-e081-a588-000000000a4d 30575 1726867601.83617: variable 'ansible_search_path' from source: unknown 30575 1726867601.83626: variable 'ansible_search_path' from source: unknown 30575 1726867601.83681: variable 'lsr_test' from source: include params 30575 1726867601.83905: variable 'lsr_test' from source: include params 30575 1726867601.83985: variable 'omit' from source: magic vars 30575 1726867601.84120: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867601.84141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867601.84159: variable 'omit' from source: magic vars 30575 1726867601.84395: variable 'ansible_distribution_major_version' from source: facts 30575 1726867601.84410: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867601.84421: variable 'item' from source: unknown 30575 1726867601.84493: variable 'item' from source: unknown 30575 1726867601.84530: variable 'item' from source: unknown 30575 1726867601.84597: variable 'item' from source: unknown 30575 1726867601.84882: dumping result to json 30575 1726867601.84886: done dumping result, returning 30575 1726867601.84889: done running TaskExecutor() for managed_node3/TASK: Test [0affcac9-a3a5-e081-a588-000000000a4d] 30575 1726867601.84891: sending task result for task 0affcac9-a3a5-e081-a588-000000000a4d 30575 1726867601.84934: done sending task result for task 0affcac9-a3a5-e081-a588-000000000a4d 30575 1726867601.84938: WORKER PROCESS EXITING 30575 1726867601.84961: no more pending results, returning what we have 30575 1726867601.84967: in VariableManager get_vars() 30575 1726867601.85009: Calling all_inventory to load vars for managed_node3 30575 1726867601.85012: Calling groups_inventory to load vars for managed_node3 30575 1726867601.85016: Calling all_plugins_inventory to load 
vars for managed_node3 30575 1726867601.85030: Calling all_plugins_play to load vars for managed_node3 30575 1726867601.85032: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867601.85036: Calling groups_plugins_play to load vars for managed_node3 30575 1726867601.86412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867601.87888: done with get_vars() 30575 1726867601.87906: variable 'ansible_search_path' from source: unknown 30575 1726867601.87907: variable 'ansible_search_path' from source: unknown 30575 1726867601.87945: we have included files to process 30575 1726867601.87946: generating all_blocks data 30575 1726867601.87948: done generating all_blocks data 30575 1726867601.87953: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30575 1726867601.87954: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30575 1726867601.87957: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30575 1726867601.88129: done processing included file 30575 1726867601.88131: iterating over new_blocks loaded from include file 30575 1726867601.88133: in VariableManager get_vars() 30575 1726867601.88147: done with get_vars() 30575 1726867601.88149: filtering new block on tags 30575 1726867601.88176: done filtering new block on tags 30575 1726867601.88180: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node3 => (item=tasks/activate_profile.yml) 30575 1726867601.88185: extending task lists for all hosts with included blocks 30575 1726867601.88894: done extending task lists 30575 
1726867601.88896: done processing included files 30575 1726867601.88897: results queue empty 30575 1726867601.88898: checking for any_errors_fatal 30575 1726867601.88902: done checking for any_errors_fatal 30575 1726867601.88902: checking for max_fail_percentage 30575 1726867601.88904: done checking for max_fail_percentage 30575 1726867601.88904: checking to see if all hosts have failed and the running result is not ok 30575 1726867601.88905: done checking to see if all hosts have failed 30575 1726867601.88906: getting the remaining hosts for this loop 30575 1726867601.88907: done getting the remaining hosts for this loop 30575 1726867601.88910: getting the next task for host managed_node3 30575 1726867601.88914: done getting next task for host managed_node3 30575 1726867601.88916: ^ task is: TASK: Include network role 30575 1726867601.88919: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867601.88922: getting variables 30575 1726867601.88923: in VariableManager get_vars() 30575 1726867601.88933: Calling all_inventory to load vars for managed_node3 30575 1726867601.88935: Calling groups_inventory to load vars for managed_node3 30575 1726867601.88937: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867601.88942: Calling all_plugins_play to load vars for managed_node3 30575 1726867601.88944: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867601.88947: Calling groups_plugins_play to load vars for managed_node3 30575 1726867601.90146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867601.91626: done with get_vars() 30575 1726867601.91645: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 17:26:41 -0400 (0:00:00.088) 0:00:37.294 ****** 30575 1726867601.91736: entering _queue_task() for managed_node3/include_role 30575 1726867601.92041: worker is 1 (out of 1 available) 30575 1726867601.92052: exiting _queue_task() for managed_node3/include_role 30575 1726867601.92063: done queuing things up, now waiting for results queue to drain 30575 1726867601.92065: waiting for pending results... 
30575 1726867601.92408: running TaskExecutor() for managed_node3/TASK: Include network role
30575 1726867601.92469: in run() - task 0affcac9-a3a5-e081-a588-000000000caa
30575 1726867601.92493: variable 'ansible_search_path' from source: unknown
30575 1726867601.92587: variable 'ansible_search_path' from source: unknown
30575 1726867601.92591: calling self._execute()
30575 1726867601.92655: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867601.92667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867601.92684: variable 'omit' from source: magic vars
30575 1726867601.93060: variable 'ansible_distribution_major_version' from source: facts
30575 1726867601.93078: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867601.93090: _execute() done
30575 1726867601.93100: dumping result to json
30575 1726867601.93108: done dumping result, returning
30575 1726867601.93120: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcac9-a3a5-e081-a588-000000000caa]
30575 1726867601.93131: sending task result for task 0affcac9-a3a5-e081-a588-000000000caa
30575 1726867601.93405: no more pending results, returning what we have
30575 1726867601.93411: in VariableManager get_vars()
30575 1726867601.93452: Calling all_inventory to load vars for managed_node3
30575 1726867601.93454: Calling groups_inventory to load vars for managed_node3
30575 1726867601.93458: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867601.93472: Calling all_plugins_play to load vars for managed_node3
30575 1726867601.93476: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867601.93481: Calling groups_plugins_play to load vars for managed_node3
30575 1726867601.94090: done sending task result for task 0affcac9-a3a5-e081-a588-000000000caa
30575 1726867601.94094: WORKER PROCESS EXITING
30575 1726867601.95007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867601.96505: done with get_vars()
30575 1726867601.96524: variable 'ansible_search_path' from source: unknown
30575 1726867601.96526: variable 'ansible_search_path' from source: unknown
30575 1726867601.96665: variable 'omit' from source: magic vars
30575 1726867601.96710: variable 'omit' from source: magic vars
30575 1726867601.96724: variable 'omit' from source: magic vars
30575 1726867601.96727: we have included files to process
30575 1726867601.96729: generating all_blocks data
30575 1726867601.96730: done generating all_blocks data
30575 1726867601.96731: processing included file: fedora.linux_system_roles.network
30575 1726867601.96751: in VariableManager get_vars()
30575 1726867601.96765: done with get_vars()
30575 1726867601.96794: in VariableManager get_vars()
30575 1726867601.96811: done with get_vars()
30575 1726867601.96849: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
30575 1726867601.96970: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
30575 1726867601.97051: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
30575 1726867601.97473: in VariableManager get_vars()
30575 1726867601.97493: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
30575 1726867601.99319: iterating over new_blocks loaded from include file
30575 1726867601.99320: in VariableManager get_vars()
30575 1726867601.99334: done with get_vars()
30575 1726867601.99336: filtering new block on tags
30575 1726867601.99590: done filtering new block on tags
30575 1726867601.99594: in VariableManager get_vars()
30575 1726867601.99607: done with get_vars()
30575 1726867601.99609: filtering new block on tags
30575 1726867601.99625: done filtering new block on tags
30575 1726867601.99627: done iterating over new_blocks loaded from include file
included: fedora.linux_system_roles.network for managed_node3
30575 1726867601.99632: extending task lists for all hosts with included blocks
30575 1726867601.99738: done extending task lists
30575 1726867601.99739: done processing included files
30575 1726867601.99740: results queue empty
30575 1726867601.99741: checking for any_errors_fatal
30575 1726867601.99744: done checking for any_errors_fatal
30575 1726867601.99745: checking for max_fail_percentage
30575 1726867601.99746: done checking for max_fail_percentage
30575 1726867601.99747: checking to see if all hosts have failed and the running result is not ok
30575 1726867601.99748: done checking to see if all hosts have failed
30575 1726867601.99748: getting the remaining hosts for this loop
30575 1726867601.99750: done getting the remaining hosts for this loop
30575 1726867601.99752: getting the next task for host managed_node3
30575 1726867601.99757: done getting next task for host managed_node3
30575 1726867601.99759: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
30575 1726867601.99763: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867601.99773: getting variables
30575 1726867601.99774: in VariableManager get_vars()
30575 1726867601.99788: Calling all_inventory to load vars for managed_node3
30575 1726867601.99791: Calling groups_inventory to load vars for managed_node3
30575 1726867601.99793: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867601.99799: Calling all_plugins_play to load vars for managed_node3
30575 1726867601.99801: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867601.99804: Calling groups_plugins_play to load vars for managed_node3
30575 1726867602.00894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867602.02474: done with get_vars()
30575 1726867602.02495: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 17:26:42 -0400 (0:00:00.108) 0:00:37.403 ******
30575 1726867602.02567: entering _queue_task() for managed_node3/include_tasks
30575 1726867602.02916: worker is 1 (out of 1 available)
30575 1726867602.02929: exiting _queue_task() for managed_node3/include_tasks
30575 1726867602.02941: done queuing things up, now waiting for results queue to drain
30575 1726867602.02943: waiting for pending results...
30575 1726867602.03173: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
30575 1726867602.03322: in run() - task 0affcac9-a3a5-e081-a588-000000000d16
30575 1726867602.03340: variable 'ansible_search_path' from source: unknown
30575 1726867602.03344: variable 'ansible_search_path' from source: unknown
30575 1726867602.03374: calling self._execute()
30575 1726867602.03465: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867602.03469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867602.03481: variable 'omit' from source: magic vars
30575 1726867602.03890: variable 'ansible_distribution_major_version' from source: facts
30575 1726867602.03900: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867602.03907: _execute() done
30575 1726867602.03910: dumping result to json
30575 1726867602.03914: done dumping result, returning
30575 1726867602.03923: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-e081-a588-000000000d16]
30575 1726867602.03931: sending task result for task 0affcac9-a3a5-e081-a588-000000000d16
30575 1726867602.04022: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d16
30575 1726867602.04025: WORKER PROCESS EXITING
30575 1726867602.04093: no more pending results, returning what we have
30575 1726867602.04098: in VariableManager get_vars()
30575 1726867602.04137: Calling all_inventory to load vars for managed_node3
30575 1726867602.04140: Calling groups_inventory to load vars for managed_node3
30575 1726867602.04142: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867602.04153: Calling all_plugins_play to load vars for managed_node3
30575 1726867602.04155: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867602.04158: Calling groups_plugins_play to load vars for managed_node3
30575 1726867602.05727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867602.07370: done with get_vars()
30575 1726867602.07395: variable 'ansible_search_path' from source: unknown
30575 1726867602.07397: variable 'ansible_search_path' from source: unknown
30575 1726867602.07445: we have included files to process
30575 1726867602.07447: generating all_blocks data
30575 1726867602.07449: done generating all_blocks data
30575 1726867602.07453: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30575 1726867602.07454: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30575 1726867602.07456: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30575 1726867602.08075: done processing included file
30575 1726867602.08079: iterating over new_blocks loaded from include file
30575 1726867602.08080: in VariableManager get_vars()
30575 1726867602.08102: done with get_vars()
30575 1726867602.08104: filtering new block on tags
30575 1726867602.08134: done filtering new block on tags
30575 1726867602.08137: in VariableManager get_vars()
30575 1726867602.08156: done with get_vars()
30575 1726867602.08158: filtering new block on tags
30575 1726867602.08203: done filtering new block on tags
30575 1726867602.08205: in VariableManager get_vars()
30575 1726867602.08225: done with get_vars()
30575 1726867602.08227: filtering new block on tags
30575 1726867602.08264: done filtering new block on tags
30575 1726867602.08267: done iterating over new_blocks loaded from include file
included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3
30575 1726867602.08275: extending task lists for all hosts with included blocks
30575 1726867602.10050: done extending task lists
30575 1726867602.10052: done processing included files
30575 1726867602.10052: results queue empty
30575 1726867602.10053: checking for any_errors_fatal
30575 1726867602.10056: done checking for any_errors_fatal
30575 1726867602.10056: checking for max_fail_percentage
30575 1726867602.10057: done checking for max_fail_percentage
30575 1726867602.10058: checking to see if all hosts have failed and the running result is not ok
30575 1726867602.10063: done checking to see if all hosts have failed
30575 1726867602.10064: getting the remaining hosts for this loop
30575 1726867602.10065: done getting the remaining hosts for this loop
30575 1726867602.10068: getting the next task for host managed_node3
30575 1726867602.10073: done getting next task for host managed_node3
30575 1726867602.10075: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
30575 1726867602.10080: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867602.10089: getting variables
30575 1726867602.10089: in VariableManager get_vars()
30575 1726867602.10100: Calling all_inventory to load vars for managed_node3
30575 1726867602.10102: Calling groups_inventory to load vars for managed_node3
30575 1726867602.10104: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867602.10108: Calling all_plugins_play to load vars for managed_node3
30575 1726867602.10110: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867602.10112: Calling groups_plugins_play to load vars for managed_node3
30575 1726867602.11335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867602.13701: done with get_vars()
30575 1726867602.13807: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 17:26:42 -0400 (0:00:00.114) 0:00:37.517 ******
30575 1726867602.13974: entering _queue_task() for managed_node3/setup
30575 1726867602.14653: worker is 1 (out of 1 available)
30575 1726867602.14667: exiting _queue_task() for managed_node3/setup
30575 1726867602.14683: done queuing things up, now waiting for results queue to drain
30575 1726867602.14685: waiting for pending results...
30575 1726867602.15339: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
30575 1726867602.15784: in run() - task 0affcac9-a3a5-e081-a588-000000000d6d
30575 1726867602.15788: variable 'ansible_search_path' from source: unknown
30575 1726867602.15791: variable 'ansible_search_path' from source: unknown
30575 1726867602.15802: calling self._execute()
30575 1726867602.15888: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867602.15892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867602.15903: variable 'omit' from source: magic vars
30575 1726867602.17188: variable 'ansible_distribution_major_version' from source: facts
30575 1726867602.17191: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867602.17472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30575 1726867602.19907: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30575 1726867602.19971: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30575 1726867602.20015: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30575 1726867602.20048: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30575 1726867602.20072: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30575 1726867602.20151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30575 1726867602.20182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30575 1726867602.20212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30575 1726867602.20248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30575 1726867602.20260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30575 1726867602.20321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30575 1726867602.20379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30575 1726867602.20415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30575 1726867602.20455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30575 1726867602.20480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30575 1726867602.20674: variable '__network_required_facts' from source: role '' defaults
30575 1726867602.20685: variable 'ansible_facts' from source: unknown
30575 1726867602.21803: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
30575 1726867602.21807: when evaluation is False, skipping this task
30575 1726867602.21810: _execute() done
30575 1726867602.21812: dumping result to json
30575 1726867602.21814: done dumping result, returning
30575 1726867602.21818: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-e081-a588-000000000d6d]
30575 1726867602.21821: sending task result for task 0affcac9-a3a5-e081-a588-000000000d6d
skipping: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
30575 1726867602.21945: no more pending results, returning what we have
30575 1726867602.21949: results queue empty
30575 1726867602.21950: checking for any_errors_fatal
30575 1726867602.21952: done checking for any_errors_fatal
30575 1726867602.21953: checking for max_fail_percentage
30575 1726867602.21955: done checking for max_fail_percentage
30575 1726867602.21956: checking to see if all hosts have failed and the running result is not ok
30575 1726867602.21957: done checking to see if all hosts have failed
30575 1726867602.21958: getting the remaining hosts for this loop
30575 1726867602.21959: done getting the remaining hosts for this loop
30575 1726867602.21964: getting the next task for host managed_node3
30575 1726867602.21976: done getting next task for host managed_node3
30575 1726867602.21982: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
30575 1726867602.21991: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867602.22014: getting variables
30575 1726867602.22016: in VariableManager get_vars()
30575 1726867602.22057: Calling all_inventory to load vars for managed_node3
30575 1726867602.22059: Calling groups_inventory to load vars for managed_node3
30575 1726867602.22062: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867602.22073: Calling all_plugins_play to load vars for managed_node3
30575 1726867602.22078: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867602.22286: Calling groups_plugins_play to load vars for managed_node3
30575 1726867602.22849: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d6d
30575 1726867602.23985: WORKER PROCESS EXITING
30575 1726867602.26406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867602.29859: done with get_vars()
30575 1726867602.29889: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024 17:26:42 -0400 (0:00:00.161) 0:00:37.678 ******
30575 1726867602.30114: entering _queue_task() for managed_node3/stat
30575 1726867602.30948: worker is 1 (out of 1 available)
30575 1726867602.30963: exiting _queue_task() for managed_node3/stat
30575 1726867602.31034: done queuing things up, now waiting for results queue to drain
30575 1726867602.31037: waiting for pending results...
30575 1726867602.31386: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree
30575 1726867602.31652: in run() - task 0affcac9-a3a5-e081-a588-000000000d6f
30575 1726867602.31679: variable 'ansible_search_path' from source: unknown
30575 1726867602.31683: variable 'ansible_search_path' from source: unknown
30575 1726867602.31728: calling self._execute()
30575 1726867602.31818: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867602.31826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867602.31844: variable 'omit' from source: magic vars
30575 1726867602.32382: variable 'ansible_distribution_major_version' from source: facts
30575 1726867602.32386: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867602.32404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30575 1726867602.32679: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30575 1726867602.32722: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30575 1726867602.32754: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30575 1726867602.32788: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30575 1726867602.32871: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30575 1726867602.32901: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30575 1726867602.32931: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30575 1726867602.32959: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30575 1726867602.33050: variable '__network_is_ostree' from source: set_fact
30575 1726867602.33057: Evaluated conditional (not __network_is_ostree is defined): False
30575 1726867602.33060: when evaluation is False, skipping this task
30575 1726867602.33063: _execute() done
30575 1726867602.33065: dumping result to json
30575 1726867602.33073: done dumping result, returning
30575 1726867602.33083: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-e081-a588-000000000d6f]
30575 1726867602.33089: sending task result for task 0affcac9-a3a5-e081-a588-000000000d6f
30575 1726867602.33285: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d6f
30575 1726867602.33292: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
30575 1726867602.33460: no more pending results, returning what we have
30575 1726867602.33463: results queue empty
30575 1726867602.33464: checking for any_errors_fatal
30575 1726867602.33469: done checking for any_errors_fatal
30575 1726867602.33470: checking for max_fail_percentage
30575 1726867602.33471: done checking for max_fail_percentage
30575 1726867602.33472: checking to see if all hosts have failed and the running result is not ok
30575 1726867602.33473: done checking to see if all hosts have failed
30575 1726867602.33474: getting the remaining hosts for this loop
30575 1726867602.33475: done getting the remaining hosts for this loop
30575 1726867602.33480: getting the next task for host managed_node3
30575 1726867602.33487: done getting next task for host managed_node3
30575 1726867602.33490: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
30575 1726867602.33495: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867602.33510: getting variables
30575 1726867602.33512: in VariableManager get_vars()
30575 1726867602.33548: Calling all_inventory to load vars for managed_node3
30575 1726867602.33551: Calling groups_inventory to load vars for managed_node3
30575 1726867602.33553: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867602.33563: Calling all_plugins_play to load vars for managed_node3
30575 1726867602.33567: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867602.33570: Calling groups_plugins_play to load vars for managed_node3
30575 1726867602.36084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867602.38107: done with get_vars()
30575 1726867602.38128: done getting variables
30575 1726867602.38301: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024 17:26:42 -0400 (0:00:00.082) 0:00:37.760 ******
30575 1726867602.38338: entering _queue_task() for managed_node3/set_fact
30575 1726867602.39075: worker is 1 (out of 1 available)
30575 1726867602.39090: exiting _queue_task() for managed_node3/set_fact
30575 1726867602.39104: done queuing things up, now waiting for results queue to drain
30575 1726867602.39105: waiting for pending results...
30575 1726867602.39493: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
30575 1726867602.39886: in run() - task 0affcac9-a3a5-e081-a588-000000000d70
30575 1726867602.39950: variable 'ansible_search_path' from source: unknown
30575 1726867602.39958: variable 'ansible_search_path' from source: unknown
30575 1726867602.40085: calling self._execute()
30575 1726867602.40200: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867602.40206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867602.40216: variable 'omit' from source: magic vars
30575 1726867602.41035: variable 'ansible_distribution_major_version' from source: facts
30575 1726867602.41046: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867602.41393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30575 1726867602.42003: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30575 1726867602.42050: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30575 1726867602.42194: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30575 1726867602.42282: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30575 1726867602.42310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30575 1726867602.42453: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30575 1726867602.42483: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30575 1726867602.42570: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30575 1726867602.42771: variable '__network_is_ostree' from source: set_fact
30575 1726867602.42780: Evaluated conditional (not __network_is_ostree is defined): False
30575 1726867602.42783: when evaluation is False, skipping this task
30575 1726867602.42786: _execute() done
30575 1726867602.42791: dumping result to json
30575 1726867602.42794: done dumping result, returning
30575 1726867602.42849: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-e081-a588-000000000d70]
30575 1726867602.42853: sending task result for task 0affcac9-a3a5-e081-a588-000000000d70
30575 1726867602.42924: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d70
30575 1726867602.42927: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
30575 1726867602.43001: no more pending results, returning what we have
30575 1726867602.43005: results queue empty
30575 1726867602.43006: checking for any_errors_fatal
30575 1726867602.43010: done checking for any_errors_fatal
30575 1726867602.43011: checking for max_fail_percentage
30575 1726867602.43012: done checking for max_fail_percentage
30575 1726867602.43013: checking to see if all hosts have failed and the running result is not ok
30575 1726867602.43014: done checking to see if all hosts have failed
30575 1726867602.43015: getting the remaining hosts for this loop
30575 1726867602.43016: done getting the remaining hosts for this loop
30575 1726867602.43019: getting the next task for host managed_node3
30575 1726867602.43030: done getting next task for host managed_node3
30575 1726867602.43034: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
30575 1726867602.43041: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867602.43084: getting variables
30575 1726867602.43086: in VariableManager get_vars()
30575 1726867602.43123: Calling all_inventory to load vars for managed_node3
30575 1726867602.43126: Calling groups_inventory to load vars for managed_node3
30575 1726867602.43128: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867602.43138: Calling all_plugins_play to load vars for managed_node3
30575 1726867602.43141: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867602.43143: Calling groups_plugins_play to load vars for managed_node3
30575 1726867602.45844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867602.49294: done with get_vars()
30575 1726867602.49394: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Friday 20 September 2024 17:26:42 -0400 (0:00:00.112) 0:00:37.873 ******
30575 1726867602.49571: entering _queue_task() for managed_node3/service_facts
30575 1726867602.50325: worker is 1 (out of 1 available)
30575 1726867602.50337: exiting _queue_task() for managed_node3/service_facts
30575 1726867602.50349: done queuing things up, now waiting for results queue to drain
30575 1726867602.50351: waiting for pending results...
30575 1726867602.51297: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867602.51487: in run() - task 0affcac9-a3a5-e081-a588-000000000d72 30575 1726867602.51690: variable 'ansible_search_path' from source: unknown 30575 1726867602.51700: variable 'ansible_search_path' from source: unknown 30575 1726867602.51727: calling self._execute() 30575 1726867602.51816: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867602.51822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867602.51834: variable 'omit' from source: magic vars 30575 1726867602.52553: variable 'ansible_distribution_major_version' from source: facts 30575 1726867602.52564: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867602.52576: variable 'omit' from source: magic vars 30575 1726867602.52655: variable 'omit' from source: magic vars 30575 1726867602.52948: variable 'omit' from source: magic vars 30575 1726867602.52989: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867602.53023: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867602.53234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867602.53237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867602.53240: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867602.53242: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867602.53245: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867602.53247: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867602.53453: Set connection var ansible_pipelining to False 30575 1726867602.53456: Set connection var ansible_shell_type to sh 30575 1726867602.53586: Set connection var ansible_shell_executable to /bin/sh 30575 1726867602.53594: Set connection var ansible_timeout to 10 30575 1726867602.53599: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867602.53607: Set connection var ansible_connection to ssh 30575 1726867602.53636: variable 'ansible_shell_executable' from source: unknown 30575 1726867602.53640: variable 'ansible_connection' from source: unknown 30575 1726867602.53643: variable 'ansible_module_compression' from source: unknown 30575 1726867602.53645: variable 'ansible_shell_type' from source: unknown 30575 1726867602.53648: variable 'ansible_shell_executable' from source: unknown 30575 1726867602.53650: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867602.53652: variable 'ansible_pipelining' from source: unknown 30575 1726867602.53654: variable 'ansible_timeout' from source: unknown 30575 1726867602.53660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867602.53980: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867602.54208: variable 'omit' from source: magic vars 30575 1726867602.54212: starting attempt loop 30575 1726867602.54215: running the handler 30575 1726867602.54283: _low_level_execute_command(): starting 30575 1726867602.54286: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867602.55627: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867602.55636: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867602.55726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867602.55883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867602.55886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867602.55943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867602.57652: stdout chunk (state=3): >>>/root <<< 30575 1726867602.57898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867602.57902: stdout chunk (state=3): >>><<< 30575 1726867602.57904: stderr chunk (state=3): >>><<< 30575 1726867602.57909: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867602.57911: _low_level_execute_command(): starting 30575 1726867602.57915: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867602.5781252-32376-106248030986002 `" && echo ansible-tmp-1726867602.5781252-32376-106248030986002="` echo /root/.ansible/tmp/ansible-tmp-1726867602.5781252-32376-106248030986002 `" ) && sleep 0' 30575 1726867602.58491: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867602.58531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867602.58586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867602.58601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867602.58730: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867602.59065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867602.59069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867602.59117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867602.61080: stdout chunk (state=3): >>>ansible-tmp-1726867602.5781252-32376-106248030986002=/root/.ansible/tmp/ansible-tmp-1726867602.5781252-32376-106248030986002 <<< 30575 1726867602.61208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867602.61240: stderr chunk (state=3): >>><<< 30575 1726867602.61255: stdout chunk (state=3): >>><<< 30575 1726867602.61281: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867602.5781252-32376-106248030986002=/root/.ansible/tmp/ansible-tmp-1726867602.5781252-32376-106248030986002 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867602.61346: variable 'ansible_module_compression' from source: unknown 30575 1726867602.61394: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30575 1726867602.61444: variable 'ansible_facts' from source: unknown 30575 1726867602.61543: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867602.5781252-32376-106248030986002/AnsiballZ_service_facts.py 30575 1726867602.61690: Sending initial data 30575 1726867602.61790: Sent initial data (162 bytes) 30575 1726867602.62275: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867602.62363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867602.62495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867602.62585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867602.64182: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867602.64235: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867602.64300: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpdh1ou_b5 /root/.ansible/tmp/ansible-tmp-1726867602.5781252-32376-106248030986002/AnsiballZ_service_facts.py <<< 30575 1726867602.64321: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867602.5781252-32376-106248030986002/AnsiballZ_service_facts.py" <<< 30575 1726867602.64361: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpdh1ou_b5" to remote "/root/.ansible/tmp/ansible-tmp-1726867602.5781252-32376-106248030986002/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867602.5781252-32376-106248030986002/AnsiballZ_service_facts.py" <<< 30575 1726867602.65172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867602.65175: stdout chunk (state=3): >>><<< 30575 1726867602.65187: stderr chunk (state=3): >>><<< 30575 1726867602.65231: done transferring module to remote 30575 1726867602.65351: _low_level_execute_command(): starting 30575 1726867602.65358: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867602.5781252-32376-106248030986002/ /root/.ansible/tmp/ansible-tmp-1726867602.5781252-32376-106248030986002/AnsiballZ_service_facts.py && sleep 0' 30575 1726867602.65819: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867602.65823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867602.65825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867602.65827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867602.65840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 <<< 30575 1726867602.65849: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867602.65893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867602.66023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867602.66026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867602.66072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867602.66292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867602.67972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867602.67975: stderr chunk (state=3): >>><<< 30575 1726867602.68001: stdout chunk (state=3): >>><<< 30575 1726867602.68005: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867602.68010: _low_level_execute_command(): starting 30575 1726867602.68013: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867602.5781252-32376-106248030986002/AnsiballZ_service_facts.py && sleep 0' 30575 1726867602.68955: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867602.68960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867602.68963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867602.68965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867602.68968: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867602.68970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867602.69278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867602.69282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867602.69285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867602.69391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867604.19834: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": 
"display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30575 1726867604.19988: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": 
"sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 30575 1726867604.19994: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integratio<<< 30575 1726867604.19997: stdout chunk (state=3): >>>n.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": 
{"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", 
"state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-<<< 30575 1726867604.20001: stdout chunk (state=3): >>>boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30575 1726867604.21486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867604.21632: stderr chunk (state=3): >>><<< 30575 1726867604.21636: stdout chunk (state=3): >>><<< 30575 1726867604.21688: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": 
{"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": 
"sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": 
{"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", 
"state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": 
"man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", 
"state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867604.22587: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867602.5781252-32376-106248030986002/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867604.22591: _low_level_execute_command(): starting 30575 1726867604.22593: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867602.5781252-32376-106248030986002/ > /dev/null 2>&1 && sleep 0' 30575 1726867604.23076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867604.23092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867604.23104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867604.23118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867604.23138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867604.23146: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867604.23270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867604.23273: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867604.23276: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address 
<<< 30575 1726867604.23281: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867604.23283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867604.23285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867604.23287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867604.23345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867604.23468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867604.23543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867604.25401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867604.25406: stdout chunk (state=3): >>><<< 30575 1726867604.25464: stderr chunk (state=3): >>><<< 30575 1726867604.25468: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867604.25470: handler run complete 30575 1726867604.25835: variable 'ansible_facts' from source: unknown 30575 1726867604.26152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867604.27214: variable 'ansible_facts' from source: unknown 30575 1726867604.27462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867604.27925: attempt loop complete, returning result 30575 1726867604.27934: _execute() done 30575 1726867604.27937: dumping result to json 30575 1726867604.28282: done dumping result, returning 30575 1726867604.28286: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-e081-a588-000000000d72] 30575 1726867604.28288: sending task result for task 0affcac9-a3a5-e081-a588-000000000d72 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867604.29492: no more pending results, returning what we have 30575 1726867604.29501: results queue empty 30575 1726867604.29502: checking for any_errors_fatal 30575 1726867604.29509: done checking for any_errors_fatal 30575 1726867604.29510: checking for max_fail_percentage 30575 1726867604.29512: done checking for max_fail_percentage 30575 1726867604.29513: checking to see if all hosts have failed and the running result is not 
ok 30575 1726867604.29514: done checking to see if all hosts have failed 30575 1726867604.29514: getting the remaining hosts for this loop 30575 1726867604.29516: done getting the remaining hosts for this loop 30575 1726867604.29520: getting the next task for host managed_node3 30575 1726867604.29528: done getting next task for host managed_node3 30575 1726867604.29531: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867604.29537: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867604.29547: getting variables 30575 1726867604.29549: in VariableManager get_vars() 30575 1726867604.29718: Calling all_inventory to load vars for managed_node3 30575 1726867604.29721: Calling groups_inventory to load vars for managed_node3 30575 1726867604.29727: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d72 30575 1726867604.29737: WORKER PROCESS EXITING 30575 1726867604.29734: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867604.29748: Calling all_plugins_play to load vars for managed_node3 30575 1726867604.29751: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867604.29754: Calling groups_plugins_play to load vars for managed_node3 30575 1726867604.32923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867604.36932: done with get_vars() 30575 1726867604.36961: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:26:44 -0400 (0:00:01.876) 0:00:39.749 ****** 30575 1726867604.37184: entering _queue_task() for managed_node3/package_facts 30575 1726867604.38043: worker is 1 (out of 1 available) 30575 1726867604.38056: exiting _queue_task() for managed_node3/package_facts 30575 1726867604.38067: done queuing things up, now waiting for results queue to drain 30575 1726867604.38070: waiting for pending results... 
30575 1726867604.38497: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867604.39054: in run() - task 0affcac9-a3a5-e081-a588-000000000d73 30575 1726867604.39060: variable 'ansible_search_path' from source: unknown 30575 1726867604.39064: variable 'ansible_search_path' from source: unknown 30575 1726867604.39067: calling self._execute() 30575 1726867604.39124: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867604.39133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867604.39143: variable 'omit' from source: magic vars 30575 1726867604.40078: variable 'ansible_distribution_major_version' from source: facts 30575 1726867604.40091: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867604.40098: variable 'omit' from source: magic vars 30575 1726867604.40359: variable 'omit' from source: magic vars 30575 1726867604.40362: variable 'omit' from source: magic vars 30575 1726867604.40370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867604.40522: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867604.40543: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867604.40561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867604.40574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867604.40782: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867604.40786: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867604.40788: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867604.41083: Set connection var ansible_pipelining to False 30575 1726867604.41086: Set connection var ansible_shell_type to sh 30575 1726867604.41089: Set connection var ansible_shell_executable to /bin/sh 30575 1726867604.41091: Set connection var ansible_timeout to 10 30575 1726867604.41093: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867604.41095: Set connection var ansible_connection to ssh 30575 1726867604.41097: variable 'ansible_shell_executable' from source: unknown 30575 1726867604.41099: variable 'ansible_connection' from source: unknown 30575 1726867604.41101: variable 'ansible_module_compression' from source: unknown 30575 1726867604.41103: variable 'ansible_shell_type' from source: unknown 30575 1726867604.41105: variable 'ansible_shell_executable' from source: unknown 30575 1726867604.41107: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867604.41109: variable 'ansible_pipelining' from source: unknown 30575 1726867604.41111: variable 'ansible_timeout' from source: unknown 30575 1726867604.41112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867604.41508: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867604.41518: variable 'omit' from source: magic vars 30575 1726867604.41523: starting attempt loop 30575 1726867604.41529: running the handler 30575 1726867604.41544: _low_level_execute_command(): starting 30575 1726867604.41556: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867604.43124: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867604.43163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867604.43232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867604.43314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867604.45283: stdout chunk (state=3): >>>/root <<< 30575 1726867604.45287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867604.45289: stdout chunk (state=3): >>><<< 30575 1726867604.45292: stderr chunk (state=3): >>><<< 30575 1726867604.45295: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867604.45298: _low_level_execute_command(): starting 30575 1726867604.45300: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867604.452734-32511-273335023160596 `" && echo ansible-tmp-1726867604.452734-32511-273335023160596="` echo /root/.ansible/tmp/ansible-tmp-1726867604.452734-32511-273335023160596 `" ) && sleep 0' 30575 1726867604.46596: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867604.46740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867604.46744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867604.46746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867604.46898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867604.48733: stdout chunk (state=3): >>>ansible-tmp-1726867604.452734-32511-273335023160596=/root/.ansible/tmp/ansible-tmp-1726867604.452734-32511-273335023160596 <<< 30575 1726867604.48839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867604.49004: stderr chunk (state=3): >>><<< 30575 1726867604.49013: stdout chunk (state=3): >>><<< 30575 1726867604.49114: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867604.452734-32511-273335023160596=/root/.ansible/tmp/ansible-tmp-1726867604.452734-32511-273335023160596 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867604.49118: variable 'ansible_module_compression' from source: unknown 30575 1726867604.49331: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30575 1726867604.49396: variable 'ansible_facts' from source: unknown 30575 1726867604.49793: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867604.452734-32511-273335023160596/AnsiballZ_package_facts.py 30575 1726867604.50170: Sending initial data 30575 1726867604.50173: Sent initial data (161 bytes) 30575 1726867604.51251: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867604.51260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867604.51272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867604.51439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867604.51575: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867604.51644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867604.53205: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30575 1726867604.53214: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30575 1726867604.53233: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867604.53278: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867604.53433: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpv804vu6_ /root/.ansible/tmp/ansible-tmp-1726867604.452734-32511-273335023160596/AnsiballZ_package_facts.py <<< 30575 1726867604.53437: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867604.452734-32511-273335023160596/AnsiballZ_package_facts.py" <<< 30575 1726867604.53451: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpv804vu6_" to remote "/root/.ansible/tmp/ansible-tmp-1726867604.452734-32511-273335023160596/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867604.452734-32511-273335023160596/AnsiballZ_package_facts.py" <<< 30575 1726867604.57397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867604.57698: stderr chunk (state=3): >>><<< 30575 1726867604.57701: stdout chunk (state=3): >>><<< 30575 1726867604.57722: done transferring module to remote 30575 1726867604.57737: _low_level_execute_command(): starting 30575 1726867604.57740: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867604.452734-32511-273335023160596/ /root/.ansible/tmp/ansible-tmp-1726867604.452734-32511-273335023160596/AnsiballZ_package_facts.py && sleep 0' 30575 1726867604.58873: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867604.59103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867604.59133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867604.59149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867604.59298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867604.61304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867604.61308: stdout chunk (state=3): >>><<< 30575 1726867604.61311: stderr chunk (state=3): >>><<< 30575 1726867604.61313: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867604.61316: _low_level_execute_command(): starting 30575 1726867604.61319: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867604.452734-32511-273335023160596/AnsiballZ_package_facts.py && sleep 0' 30575 1726867604.62673: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867604.62784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867604.62900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867604.62980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867605.07147: stdout chunk (state=3): >>> {"ansible_facts": 
{"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": 
[{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30575 1726867605.07168: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", 
"release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30575 1726867605.07320: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": 
[{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": 
"3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", 
"release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": 
"4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", 
"version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", 
"source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": 
[{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": 
[{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": 
"4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", 
"version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30575 1726867605.09083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867605.09087: stdout chunk (state=3): >>><<< 30575 1726867605.09092: stderr chunk (state=3): >>><<< 30575 1726867605.09515: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": 
"google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": 
"20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": 
[{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": 
[{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": 
[{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": 
"5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": 
"4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": 
"0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": 
[{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", 
"version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": 
"libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": 
[{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": 
"openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": 
"perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": 
"511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": 
[{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": 
[{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": 
"rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867605.13307: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867604.452734-32511-273335023160596/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867605.13361: _low_level_execute_command(): starting 30575 1726867605.13365: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867604.452734-32511-273335023160596/ > /dev/null 2>&1 && sleep 0' 30575 1726867605.14664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867605.14710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867605.14798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867605.16678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867605.16682: stdout chunk (state=3): >>><<< 30575 1726867605.16689: stderr chunk (state=3): >>><<< 30575 1726867605.16729: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 30575 1726867605.16732: handler run complete 30575 1726867605.18171: variable 'ansible_facts' from source: unknown 30575 1726867605.18763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867605.20651: variable 'ansible_facts' from source: unknown 30575 1726867605.21089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867605.22488: attempt loop complete, returning result 30575 1726867605.22493: _execute() done 30575 1726867605.22495: dumping result to json 30575 1726867605.23031: done dumping result, returning 30575 1726867605.23035: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-e081-a588-000000000d73] 30575 1726867605.23037: sending task result for task 0affcac9-a3a5-e081-a588-000000000d73 30575 1726867605.26189: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d73 30575 1726867605.26193: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867605.26350: no more pending results, returning what we have 30575 1726867605.26353: results queue empty 30575 1726867605.26354: checking for any_errors_fatal 30575 1726867605.26360: done checking for any_errors_fatal 30575 1726867605.26361: checking for max_fail_percentage 30575 1726867605.26362: done checking for max_fail_percentage 30575 1726867605.26363: checking to see if all hosts have failed and the running result is not ok 30575 1726867605.26364: done checking to see if all hosts have failed 30575 1726867605.26364: getting the remaining hosts for this loop 30575 1726867605.26366: done getting the remaining hosts for this loop 30575 1726867605.26379: getting the next task for host managed_node3 30575 1726867605.26387: done getting next task for 
host managed_node3 30575 1726867605.26391: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867605.26397: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867605.26408: getting variables 30575 1726867605.26409: in VariableManager get_vars() 30575 1726867605.26439: Calling all_inventory to load vars for managed_node3 30575 1726867605.26442: Calling groups_inventory to load vars for managed_node3 30575 1726867605.26444: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867605.26453: Calling all_plugins_play to load vars for managed_node3 30575 1726867605.26455: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867605.26458: Calling groups_plugins_play to load vars for managed_node3 30575 1726867605.27830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867605.29810: done with get_vars() 30575 1726867605.29835: done getting variables 30575 1726867605.29913: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:26:45 -0400 (0:00:00.927) 0:00:40.677 ****** 30575 1726867605.29986: entering _queue_task() for managed_node3/debug 30575 1726867605.30619: worker is 1 (out of 1 available) 30575 1726867605.30896: exiting _queue_task() for managed_node3/debug 30575 1726867605.30910: done queuing things up, now waiting for results queue to drain 30575 1726867605.30912: waiting for pending results... 
30575 1726867605.31054: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867605.31329: in run() - task 0affcac9-a3a5-e081-a588-000000000d17 30575 1726867605.31333: variable 'ansible_search_path' from source: unknown 30575 1726867605.31335: variable 'ansible_search_path' from source: unknown 30575 1726867605.31367: calling self._execute() 30575 1726867605.31548: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867605.31563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867605.31591: variable 'omit' from source: magic vars 30575 1726867605.32097: variable 'ansible_distribution_major_version' from source: facts 30575 1726867605.32100: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867605.32108: variable 'omit' from source: magic vars 30575 1726867605.32183: variable 'omit' from source: magic vars 30575 1726867605.32295: variable 'network_provider' from source: set_fact 30575 1726867605.32335: variable 'omit' from source: magic vars 30575 1726867605.32383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867605.32444: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867605.32462: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867605.32485: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867605.32531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867605.32551: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867605.32561: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867605.32567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867605.32676: Set connection var ansible_pipelining to False 30575 1726867605.32749: Set connection var ansible_shell_type to sh 30575 1726867605.32752: Set connection var ansible_shell_executable to /bin/sh 30575 1726867605.32754: Set connection var ansible_timeout to 10 30575 1726867605.32756: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867605.32757: Set connection var ansible_connection to ssh 30575 1726867605.32759: variable 'ansible_shell_executable' from source: unknown 30575 1726867605.32763: variable 'ansible_connection' from source: unknown 30575 1726867605.32768: variable 'ansible_module_compression' from source: unknown 30575 1726867605.32770: variable 'ansible_shell_type' from source: unknown 30575 1726867605.32772: variable 'ansible_shell_executable' from source: unknown 30575 1726867605.32773: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867605.32780: variable 'ansible_pipelining' from source: unknown 30575 1726867605.32788: variable 'ansible_timeout' from source: unknown 30575 1726867605.32795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867605.32946: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867605.32968: variable 'omit' from source: magic vars 30575 1726867605.32982: starting attempt loop 30575 1726867605.32989: running the handler 30575 1726867605.33039: handler run complete 30575 1726867605.33091: attempt loop complete, returning result 30575 1726867605.33094: _execute() done 30575 1726867605.33096: dumping result to json 30575 1726867605.33098: done dumping result, returning 
30575 1726867605.33298: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-e081-a588-000000000d17] 30575 1726867605.33308: sending task result for task 0affcac9-a3a5-e081-a588-000000000d17 30575 1726867605.33480: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d17 ok: [managed_node3] => {} MSG: Using network provider: nm 30575 1726867605.33557: WORKER PROCESS EXITING 30575 1726867605.33574: no more pending results, returning what we have 30575 1726867605.33580: results queue empty 30575 1726867605.33581: checking for any_errors_fatal 30575 1726867605.33592: done checking for any_errors_fatal 30575 1726867605.33593: checking for max_fail_percentage 30575 1726867605.33595: done checking for max_fail_percentage 30575 1726867605.33596: checking to see if all hosts have failed and the running result is not ok 30575 1726867605.33597: done checking to see if all hosts have failed 30575 1726867605.33598: getting the remaining hosts for this loop 30575 1726867605.33600: done getting the remaining hosts for this loop 30575 1726867605.33603: getting the next task for host managed_node3 30575 1726867605.33613: done getting next task for host managed_node3 30575 1726867605.33617: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867605.33626: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867605.33648: getting variables 30575 1726867605.33650: in VariableManager get_vars() 30575 1726867605.33687: Calling all_inventory to load vars for managed_node3 30575 1726867605.33690: Calling groups_inventory to load vars for managed_node3 30575 1726867605.33692: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867605.33703: Calling all_plugins_play to load vars for managed_node3 30575 1726867605.33706: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867605.33709: Calling groups_plugins_play to load vars for managed_node3 30575 1726867605.37142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867605.41330: done with get_vars() 30575 1726867605.41360: done getting variables 30575 1726867605.41530: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:26:45 -0400 (0:00:00.117) 0:00:40.794 ****** 30575 1726867605.41697: entering _queue_task() for managed_node3/fail 30575 1726867605.42299: worker is 1 (out of 1 available) 30575 1726867605.42537: exiting _queue_task() for managed_node3/fail 30575 1726867605.42549: done queuing things up, now waiting for results queue to drain 30575 1726867605.42550: waiting for pending results... 30575 1726867605.43149: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867605.43274: in run() - task 0affcac9-a3a5-e081-a588-000000000d18 30575 1726867605.43681: variable 'ansible_search_path' from source: unknown 30575 1726867605.43686: variable 'ansible_search_path' from source: unknown 30575 1726867605.43689: calling self._execute() 30575 1726867605.43692: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867605.43695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867605.43697: variable 'omit' from source: magic vars 30575 1726867605.44053: variable 'ansible_distribution_major_version' from source: facts 30575 1726867605.44157: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867605.44304: variable 'network_state' from source: role '' defaults 30575 1726867605.44316: Evaluated conditional (network_state != {}): False 30575 1726867605.44320: when evaluation is False, skipping this task 30575 1726867605.44326: _execute() done 30575 1726867605.44329: dumping result to json 30575 1726867605.44332: done dumping result, returning 30575 1726867605.44335: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-e081-a588-000000000d18] 30575 1726867605.44341: sending task result for task 0affcac9-a3a5-e081-a588-000000000d18 30575 1726867605.44436: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d18 30575 1726867605.44440: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867605.44500: no more pending results, returning what we have 30575 1726867605.44505: results queue empty 30575 1726867605.44506: checking for any_errors_fatal 30575 1726867605.44512: done checking for any_errors_fatal 30575 1726867605.44513: checking for max_fail_percentage 30575 1726867605.44515: done checking for max_fail_percentage 30575 1726867605.44516: checking to see if all hosts have failed and the running result is not ok 30575 1726867605.44517: done checking to see if all hosts have failed 30575 1726867605.44518: getting the remaining hosts for this loop 30575 1726867605.44520: done getting the remaining hosts for this loop 30575 1726867605.44524: getting the next task for host managed_node3 30575 1726867605.44535: done getting next task for host managed_node3 30575 1726867605.44540: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867605.44546: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867605.44570: getting variables 30575 1726867605.44572: in VariableManager get_vars() 30575 1726867605.44752: Calling all_inventory to load vars for managed_node3 30575 1726867605.44755: Calling groups_inventory to load vars for managed_node3 30575 1726867605.44758: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867605.44769: Calling all_plugins_play to load vars for managed_node3 30575 1726867605.44772: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867605.44774: Calling groups_plugins_play to load vars for managed_node3 30575 1726867605.46438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867605.54536: done with get_vars() 30575 1726867605.54562: done getting variables 30575 1726867605.54618: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:26:45 -0400 (0:00:00.129) 0:00:40.924 ****** 30575 1726867605.54647: entering _queue_task() for managed_node3/fail 30575 1726867605.55017: worker is 1 (out of 1 available) 30575 1726867605.55037: exiting _queue_task() for managed_node3/fail 30575 1726867605.55050: done queuing things up, now waiting for results queue to drain 30575 1726867605.55052: waiting for pending results... 30575 1726867605.55356: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867605.55498: in run() - task 0affcac9-a3a5-e081-a588-000000000d19 30575 1726867605.55514: variable 'ansible_search_path' from source: unknown 30575 1726867605.55520: variable 'ansible_search_path' from source: unknown 30575 1726867605.55552: calling self._execute() 30575 1726867605.55644: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867605.55651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867605.55660: variable 'omit' from source: magic vars 30575 1726867605.56065: variable 'ansible_distribution_major_version' from source: facts 30575 1726867605.56092: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867605.56266: variable 'network_state' from source: role '' defaults 30575 1726867605.56285: Evaluated conditional (network_state != {}): False 30575 1726867605.56289: when evaluation is False, skipping this task 30575 1726867605.56294: _execute() done 30575 1726867605.56299: dumping result to json 30575 1726867605.56301: done dumping result, returning 30575 1726867605.56304: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-e081-a588-000000000d19] 30575 1726867605.56307: sending task result for task 0affcac9-a3a5-e081-a588-000000000d19 30575 1726867605.56504: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d19 30575 1726867605.56507: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867605.56572: no more pending results, returning what we have 30575 1726867605.56579: results queue empty 30575 1726867605.56582: checking for any_errors_fatal 30575 1726867605.56593: done checking for any_errors_fatal 30575 1726867605.56594: checking for max_fail_percentage 30575 1726867605.56595: done checking for max_fail_percentage 30575 1726867605.56596: checking to see if all hosts have failed and the running result is not ok 30575 1726867605.56598: done checking to see if all hosts have failed 30575 1726867605.56598: getting the remaining hosts for this loop 30575 1726867605.56601: done getting the remaining hosts for this loop 30575 1726867605.56605: getting the next task for host managed_node3 30575 1726867605.56613: done getting next task for host managed_node3 30575 1726867605.56617: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867605.56622: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867605.56646: getting variables 30575 1726867605.56648: in VariableManager get_vars() 30575 1726867605.56951: Calling all_inventory to load vars for managed_node3 30575 1726867605.56954: Calling groups_inventory to load vars for managed_node3 30575 1726867605.56957: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867605.56973: Calling all_plugins_play to load vars for managed_node3 30575 1726867605.56976: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867605.56984: Calling groups_plugins_play to load vars for managed_node3 30575 1726867605.59271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867605.60958: done with get_vars() 30575 1726867605.60987: done getting variables 30575 1726867605.61090: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:26:45 -0400 (0:00:00.064) 0:00:40.988 ****** 30575 1726867605.61126: entering _queue_task() for managed_node3/fail 30575 1726867605.62129: worker is 1 (out of 1 available) 30575 1726867605.62142: exiting _queue_task() for managed_node3/fail 30575 1726867605.62152: done queuing things up, now waiting for results queue to drain 30575 1726867605.62154: waiting for pending results... 30575 1726867605.62897: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867605.63178: in run() - task 0affcac9-a3a5-e081-a588-000000000d1a 30575 1726867605.63394: variable 'ansible_search_path' from source: unknown 30575 1726867605.63398: variable 'ansible_search_path' from source: unknown 30575 1726867605.63445: calling self._execute() 30575 1726867605.63770: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867605.63783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867605.63802: variable 'omit' from source: magic vars 30575 1726867605.64724: variable 'ansible_distribution_major_version' from source: facts 30575 1726867605.64740: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867605.65097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867605.70475: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867605.71148: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867605.71369: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867605.71481: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867605.71512: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867605.71641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867605.71670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867605.71810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867605.71903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867605.71919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867605.72144: variable 'ansible_distribution_major_version' from source: facts 30575 1726867605.72157: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30575 1726867605.72407: variable 'ansible_distribution' from source: facts 30575 1726867605.72410: variable '__network_rh_distros' from source: role '' defaults 30575 1726867605.72422: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30575 1726867605.73034: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867605.73183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867605.73186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867605.73236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867605.73249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867605.73435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867605.73438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867605.73485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867605.73605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 
1726867605.73609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867605.73657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867605.73889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867605.73917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867605.73952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867605.73967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867605.75485: variable 'network_connections' from source: include params 30575 1726867605.75489: variable 'interface' from source: play vars 30575 1726867605.75491: variable 'interface' from source: play vars 30575 1726867605.75493: variable 'network_state' from source: role '' defaults 30575 1726867605.75627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867605.75917: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867605.76091: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867605.76126: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867605.76297: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867605.76343: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867605.76483: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867605.76510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867605.76536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867605.76561: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30575 1726867605.76570: when evaluation is False, skipping this task 30575 1726867605.76742: _execute() done 30575 1726867605.76745: dumping result to json 30575 1726867605.76747: done dumping result, returning 30575 1726867605.76750: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-e081-a588-000000000d1a] 30575 1726867605.76752: sending task result for task 
0affcac9-a3a5-e081-a588-000000000d1a 30575 1726867605.76832: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d1a 30575 1726867605.76836: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30575 1726867605.76890: no more pending results, returning what we have 30575 1726867605.76894: results queue empty 30575 1726867605.76895: checking for any_errors_fatal 30575 1726867605.76901: done checking for any_errors_fatal 30575 1726867605.76901: checking for max_fail_percentage 30575 1726867605.76904: done checking for max_fail_percentage 30575 1726867605.76905: checking to see if all hosts have failed and the running result is not ok 30575 1726867605.76906: done checking to see if all hosts have failed 30575 1726867605.76907: getting the remaining hosts for this loop 30575 1726867605.76908: done getting the remaining hosts for this loop 30575 1726867605.76912: getting the next task for host managed_node3 30575 1726867605.76922: done getting next task for host managed_node3 30575 1726867605.76926: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867605.76931: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867605.76953: getting variables 30575 1726867605.76955: in VariableManager get_vars() 30575 1726867605.76996: Calling all_inventory to load vars for managed_node3 30575 1726867605.76999: Calling groups_inventory to load vars for managed_node3 30575 1726867605.77002: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867605.77013: Calling all_plugins_play to load vars for managed_node3 30575 1726867605.77017: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867605.77020: Calling groups_plugins_play to load vars for managed_node3 30575 1726867605.81553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867605.84748: done with get_vars() 30575 1726867605.84770: done getting variables 30575 1726867605.84834: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:26:45 -0400 (0:00:00.237) 0:00:41.226 ****** 30575 1726867605.84878: entering _queue_task() for managed_node3/dnf 30575 1726867605.85234: worker is 1 (out of 1 available) 30575 1726867605.85248: exiting _queue_task() for managed_node3/dnf 30575 1726867605.85261: done queuing things up, now waiting for results queue to drain 30575 1726867605.85263: waiting for pending results... 30575 1726867605.86016: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867605.86641: in run() - task 0affcac9-a3a5-e081-a588-000000000d1b 30575 1726867605.86647: variable 'ansible_search_path' from source: unknown 30575 1726867605.86712: variable 'ansible_search_path' from source: unknown 30575 1726867605.86907: calling self._execute() 30575 1726867605.87047: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867605.87082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867605.87129: variable 'omit' from source: magic vars 30575 1726867605.87826: variable 'ansible_distribution_major_version' from source: facts 30575 1726867605.88081: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867605.88369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867605.90618: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867605.90717: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867605.90766: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867605.90876: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867605.90881: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867605.90996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867605.91201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867605.91205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867605.91209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867605.91211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867605.91559: variable 'ansible_distribution' from source: facts 30575 1726867605.91569: variable 'ansible_distribution_major_version' from source: facts 30575 1726867605.91594: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30575 1726867605.92050: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867605.92389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867605.92485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867605.92553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867605.92702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867605.92725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867605.92821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867605.92845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867605.92871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867605.92919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867605.92953: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867605.92981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867605.93054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867605.93058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867605.93092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867605.93121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867605.93273: variable 'network_connections' from source: include params 30575 1726867605.93279: variable 'interface' from source: play vars 30575 1726867605.93336: variable 'interface' from source: play vars 30575 1726867605.93437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867605.93563: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867605.93594: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867605.93617: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867605.93645: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867605.93679: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867605.93698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867605.93717: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867605.93739: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867605.93793: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867605.93942: variable 'network_connections' from source: include params 30575 1726867605.93945: variable 'interface' from source: play vars 30575 1726867605.93993: variable 'interface' from source: play vars 30575 1726867605.94011: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867605.94014: when evaluation is False, skipping this task 30575 1726867605.94017: _execute() done 30575 1726867605.94021: dumping result to json 30575 1726867605.94023: done dumping result, returning 30575 1726867605.94034: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000000d1b] 30575 
1726867605.94039: sending task result for task 0affcac9-a3a5-e081-a588-000000000d1b 30575 1726867605.94130: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d1b 30575 1726867605.94134: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867605.94186: no more pending results, returning what we have 30575 1726867605.94190: results queue empty 30575 1726867605.94191: checking for any_errors_fatal 30575 1726867605.94198: done checking for any_errors_fatal 30575 1726867605.94198: checking for max_fail_percentage 30575 1726867605.94200: done checking for max_fail_percentage 30575 1726867605.94201: checking to see if all hosts have failed and the running result is not ok 30575 1726867605.94202: done checking to see if all hosts have failed 30575 1726867605.94203: getting the remaining hosts for this loop 30575 1726867605.94204: done getting the remaining hosts for this loop 30575 1726867605.94208: getting the next task for host managed_node3 30575 1726867605.94215: done getting next task for host managed_node3 30575 1726867605.94219: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867605.94223: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867605.94245: getting variables 30575 1726867605.94246: in VariableManager get_vars() 30575 1726867605.94286: Calling all_inventory to load vars for managed_node3 30575 1726867605.94289: Calling groups_inventory to load vars for managed_node3 30575 1726867605.94291: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867605.94300: Calling all_plugins_play to load vars for managed_node3 30575 1726867605.94302: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867605.94305: Calling groups_plugins_play to load vars for managed_node3 30575 1726867605.95802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867605.97396: done with get_vars() 30575 1726867605.97422: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867605.97812: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:26:45 -0400 (0:00:00.129) 0:00:41.356 ****** 30575 1726867605.97846: entering _queue_task() for managed_node3/yum 30575 1726867605.98932: worker is 1 (out of 1 available) 30575 1726867605.98946: exiting _queue_task() for managed_node3/yum 30575 1726867605.98961: done queuing things up, now waiting for results queue to drain 30575 1726867605.98963: waiting for pending results... 30575 1726867605.99527: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867605.99628: in run() - task 0affcac9-a3a5-e081-a588-000000000d1c 30575 1726867605.99636: variable 'ansible_search_path' from source: unknown 30575 1726867605.99641: variable 'ansible_search_path' from source: unknown 30575 1726867605.99672: calling self._execute() 30575 1726867605.99752: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867605.99757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867605.99767: variable 'omit' from source: magic vars 30575 1726867606.00045: variable 'ansible_distribution_major_version' from source: facts 30575 1726867606.00057: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867606.00185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867606.02918: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867606.03116: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867606.03120: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867606.03122: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867606.03131: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867606.03330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867606.03364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867606.03388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.03430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867606.03484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867606.03715: variable 'ansible_distribution_major_version' from source: facts 30575 1726867606.03719: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30575 1726867606.03721: when evaluation is False, skipping this task 30575 1726867606.03726: _execute() done 30575 1726867606.03731: dumping result to json 30575 1726867606.03735: done dumping result, returning 30575 1726867606.03738: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000000d1c] 30575 1726867606.03741: sending task result for task 0affcac9-a3a5-e081-a588-000000000d1c skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30575 1726867606.03983: no more pending results, returning what we have 30575 1726867606.03986: results queue empty 30575 1726867606.03987: checking for any_errors_fatal 30575 1726867606.03992: done checking for any_errors_fatal 30575 1726867606.03993: checking for max_fail_percentage 30575 1726867606.03994: done checking for max_fail_percentage 30575 1726867606.03995: checking to see if all hosts have failed and the running result is not ok 30575 1726867606.03996: done checking to see if all hosts have failed 30575 1726867606.03997: getting the remaining hosts for this loop 30575 1726867606.03998: done getting the remaining hosts for this loop 30575 1726867606.04002: getting the next task for host managed_node3 30575 1726867606.04010: done getting next task for host managed_node3 30575 1726867606.04021: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867606.04028: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867606.04137: getting variables 30575 1726867606.04139: in VariableManager get_vars() 30575 1726867606.04172: Calling all_inventory to load vars for managed_node3 30575 1726867606.04174: Calling groups_inventory to load vars for managed_node3 30575 1726867606.04200: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867606.04207: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d1c 30575 1726867606.04210: WORKER PROCESS EXITING 30575 1726867606.04218: Calling all_plugins_play to load vars for managed_node3 30575 1726867606.04221: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867606.04226: Calling groups_plugins_play to load vars for managed_node3 30575 1726867606.06099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867606.08647: done with get_vars() 30575 1726867606.08670: done getting variables 30575 1726867606.08885: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** 
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:26:46 -0400 (0:00:00.111) 0:00:41.467 ****** 30575 1726867606.08982: entering _queue_task() for managed_node3/fail 30575 1726867606.09540: worker is 1 (out of 1 available) 30575 1726867606.09554: exiting _queue_task() for managed_node3/fail 30575 1726867606.09572: done queuing things up, now waiting for results queue to drain 30575 1726867606.09574: waiting for pending results... 30575 1726867606.10000: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867606.10203: in run() - task 0affcac9-a3a5-e081-a588-000000000d1d 30575 1726867606.10232: variable 'ansible_search_path' from source: unknown 30575 1726867606.10241: variable 'ansible_search_path' from source: unknown 30575 1726867606.10293: calling self._execute() 30575 1726867606.10400: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867606.10413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867606.10430: variable 'omit' from source: magic vars 30575 1726867606.10938: variable 'ansible_distribution_major_version' from source: facts 30575 1726867606.10942: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867606.11072: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867606.11287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867606.13009: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867606.13181: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867606.13184: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867606.13187: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867606.13269: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867606.13470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867606.13474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867606.13478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.13480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867606.13683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867606.13717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867606.13785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867606.13788: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.13851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867606.13868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867606.13925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867606.13944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867606.13963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.13990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867606.14004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867606.14133: variable 'network_connections' from source: include params 30575 1726867606.14141: variable 'interface' from source: play vars 30575 1726867606.14191: variable 'interface' from source: play vars 30575 1726867606.14243: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867606.14353: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867606.14391: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867606.14414: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867606.14439: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867606.14471: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867606.14491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867606.14508: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.14527: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867606.14564: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867606.14719: variable 'network_connections' from source: include params 30575 1726867606.14723: variable 'interface' from source: play vars 30575 1726867606.14769: variable 'interface' from source: play vars 30575 1726867606.14789: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867606.14793: when evaluation is False, skipping this task 30575 
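The record above shows the task's `when` conditional, `__network_wireless_connections_defined or __network_team_connections_defined`, evaluating to False, after which the executor emits a skip result rather than running the module. A minimal sketch of that decision (not the Ansible source; the `evaluate_when` helper and its signature are hypothetical, but the result dict mirrors the `skipping:` JSON in this log):

```python
# Hypothetical model of the skip decision logged above.
# Ansible renders the when-expression with host vars; here the two
# role-default booleans are passed in directly for illustration.
def evaluate_when(wireless_defined: bool, team_defined: bool) -> dict:
    """Return an Ansible-style task result for the logged conditional."""
    if not (wireless_defined or team_defined):
        # Mirrors the "skipping: [managed_node3]" payload in the log.
        return {
            "changed": False,
            "false_condition": "__network_wireless_connections_defined "
                               "or __network_team_connections_defined",
            "skip_reason": "Conditional result was False",
        }
    return {"changed": False}  # conditional held; task would execute

result = evaluate_when(wireless_defined=False, team_defined=False)
```

With both role defaults False, the task is skipped before any module code runs on the target, which is why the log jumps straight to "sending task result".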
1726867606.14796: _execute() done 30575 1726867606.14798: dumping result to json 30575 1726867606.14800: done dumping result, returning 30575 1726867606.14808: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000000d1d] 30575 1726867606.14812: sending task result for task 0affcac9-a3a5-e081-a588-000000000d1d 30575 1726867606.14902: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d1d 30575 1726867606.14904: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867606.14951: no more pending results, returning what we have 30575 1726867606.14954: results queue empty 30575 1726867606.14955: checking for any_errors_fatal 30575 1726867606.14961: done checking for any_errors_fatal 30575 1726867606.14961: checking for max_fail_percentage 30575 1726867606.14963: done checking for max_fail_percentage 30575 1726867606.14964: checking to see if all hosts have failed and the running result is not ok 30575 1726867606.14964: done checking to see if all hosts have failed 30575 1726867606.14965: getting the remaining hosts for this loop 30575 1726867606.14966: done getting the remaining hosts for this loop 30575 1726867606.14970: getting the next task for host managed_node3 30575 1726867606.14980: done getting next task for host managed_node3 30575 1726867606.14983: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30575 1726867606.14988: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867606.15007: getting variables 30575 1726867606.15008: in VariableManager get_vars() 30575 1726867606.15043: Calling all_inventory to load vars for managed_node3 30575 1726867606.15045: Calling groups_inventory to load vars for managed_node3 30575 1726867606.15047: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867606.15056: Calling all_plugins_play to load vars for managed_node3 30575 1726867606.15059: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867606.15061: Calling groups_plugins_play to load vars for managed_node3 30575 1726867606.15919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867606.17519: done with get_vars() 30575 1726867606.17535: done getting variables 30575 1726867606.17575: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:26:46 -0400 (0:00:00.086) 0:00:41.553 ****** 30575 1726867606.17601: entering _queue_task() for managed_node3/package 30575 1726867606.17925: worker is 1 (out of 1 available) 30575 1726867606.17939: exiting _queue_task() for managed_node3/package 30575 1726867606.17956: done queuing things up, now waiting for results queue to drain 30575 1726867606.17957: waiting for pending results... 30575 1726867606.18257: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30575 1726867606.18366: in run() - task 0affcac9-a3a5-e081-a588-000000000d1e 30575 1726867606.18371: variable 'ansible_search_path' from source: unknown 30575 1726867606.18374: variable 'ansible_search_path' from source: unknown 30575 1726867606.18403: calling self._execute() 30575 1726867606.18502: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867606.18508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867606.18535: variable 'omit' from source: magic vars 30575 1726867606.18916: variable 'ansible_distribution_major_version' from source: facts 30575 1726867606.18920: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867606.19324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867606.19412: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867606.19454: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867606.19491: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867606.19572: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867606.19705: variable 'network_packages' from source: role '' defaults 30575 1726867606.19809: variable '__network_provider_setup' from source: role '' defaults 30575 1726867606.19812: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867606.19881: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867606.19894: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867606.19965: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867606.20166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867606.22061: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867606.22112: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867606.22147: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867606.22168: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867606.22191: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867606.22266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867606.22290: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867606.22307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.22336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867606.22347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867606.22384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867606.22403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867606.22420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.22447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867606.22457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 
1726867606.22605: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867606.22682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867606.22702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867606.22719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.22745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867606.22756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867606.22822: variable 'ansible_python' from source: facts 30575 1726867606.22838: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867606.22895: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867606.22967: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867606.23097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867606.23120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867606.23143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.23169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867606.23181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867606.23245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867606.23270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867606.23286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.23334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867606.23365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867606.23472: variable 'network_connections' from source: include params 
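Throughout this trace the role gates tasks on two version checks: `ansible_distribution_major_version != '6'` (a string comparison) and `ansible_distribution_major_version | int < 8` (an explicit integer cast). The cast matters because the fact is a string, and string comparison of version numbers is lexicographic. A short sketch of the difference (the `"10"` value is illustrative, not from this log):

```python
# Why the role writes "| int" before a numeric comparison:
# ansible_distribution_major_version is a string fact.
major = "10"

# Lexicographic string comparison: "1" < "8", so "10" sorts before "8".
as_string = major < "8"      # True  -- would wrongly treat 10 as older than 8

# What the Jinja expression "| int < 8" actually evaluates.
as_int = int(major) < 8      # False -- correct numeric ordering
```

Equality checks like `!= '6'` are safe on strings, which is why the log shows both forms side by side.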
30575 1726867606.23486: variable 'interface' from source: play vars 30575 1726867606.23621: variable 'interface' from source: play vars 30575 1726867606.23651: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867606.23750: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867606.23753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.23756: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867606.23904: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867606.24264: variable 'network_connections' from source: include params 30575 1726867606.24267: variable 'interface' from source: play vars 30575 1726867606.24358: variable 'interface' from source: play vars 30575 1726867606.24453: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867606.24508: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867606.24726: variable 'network_connections' from source: include params 30575 1726867606.24729: variable 'interface' from source: play vars 30575 1726867606.24772: variable 'interface' from source: play vars 30575 1726867606.24791: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867606.24846: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867606.25038: variable 'network_connections' 
from source: include params 30575 1726867606.25042: variable 'interface' from source: play vars 30575 1726867606.25088: variable 'interface' from source: play vars 30575 1726867606.25128: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867606.25168: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867606.25174: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867606.25227: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867606.25380: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867606.25731: variable 'network_connections' from source: include params 30575 1726867606.25735: variable 'interface' from source: play vars 30575 1726867606.25802: variable 'interface' from source: play vars 30575 1726867606.25806: variable 'ansible_distribution' from source: facts 30575 1726867606.25811: variable '__network_rh_distros' from source: role '' defaults 30575 1726867606.25816: variable 'ansible_distribution_major_version' from source: facts 30575 1726867606.25828: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867606.25972: variable 'ansible_distribution' from source: facts 30575 1726867606.25978: variable '__network_rh_distros' from source: role '' defaults 30575 1726867606.25981: variable 'ansible_distribution_major_version' from source: facts 30575 1726867606.25994: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867606.26100: variable 'ansible_distribution' from source: facts 30575 1726867606.26103: variable '__network_rh_distros' from source: role '' defaults 30575 1726867606.26108: variable 'ansible_distribution_major_version' from source: facts 30575 1726867606.26138: variable 'network_provider' from source: set_fact 30575 
1726867606.26150: variable 'ansible_facts' from source: unknown 30575 1726867606.26618: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30575 1726867606.26623: when evaluation is False, skipping this task 30575 1726867606.26626: _execute() done 30575 1726867606.26629: dumping result to json 30575 1726867606.26633: done dumping result, returning 30575 1726867606.26636: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-e081-a588-000000000d1e] 30575 1726867606.26638: sending task result for task 0affcac9-a3a5-e081-a588-000000000d1e 30575 1726867606.26766: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d1e 30575 1726867606.26769: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30575 1726867606.26878: no more pending results, returning what we have 30575 1726867606.26882: results queue empty 30575 1726867606.26883: checking for any_errors_fatal 30575 1726867606.26893: done checking for any_errors_fatal 30575 1726867606.26894: checking for max_fail_percentage 30575 1726867606.26897: done checking for max_fail_percentage 30575 1726867606.26898: checking to see if all hosts have failed and the running result is not ok 30575 1726867606.26899: done checking to see if all hosts have failed 30575 1726867606.26899: getting the remaining hosts for this loop 30575 1726867606.26903: done getting the remaining hosts for this loop 30575 1726867606.26908: getting the next task for host managed_node3 30575 1726867606.26917: done getting next task for host managed_node3 30575 1726867606.26923: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867606.26927: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867606.26944: getting variables 30575 1726867606.26947: in VariableManager get_vars() 30575 1726867606.26997: Calling all_inventory to load vars for managed_node3 30575 1726867606.27000: Calling groups_inventory to load vars for managed_node3 30575 1726867606.27002: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867606.27012: Calling all_plugins_play to load vars for managed_node3 30575 1726867606.27014: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867606.27017: Calling groups_plugins_play to load vars for managed_node3 30575 1726867606.28465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867606.30123: done with get_vars() 30575 1726867606.30151: done getting variables 30575 1726867606.30232: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:26:46 -0400 (0:00:00.126) 0:00:41.680 ****** 30575 1726867606.30280: entering _queue_task() for managed_node3/package 30575 1726867606.30799: worker is 1 (out of 1 available) 30575 1726867606.30811: exiting _queue_task() for managed_node3/package 30575 1726867606.30829: done queuing things up, now waiting for results queue to drain 30575 1726867606.30833: waiting for pending results... 
30575 1726867606.31152: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867606.31157: in run() - task 0affcac9-a3a5-e081-a588-000000000d1f 30575 1726867606.31160: variable 'ansible_search_path' from source: unknown 30575 1726867606.31163: variable 'ansible_search_path' from source: unknown 30575 1726867606.31276: calling self._execute() 30575 1726867606.31304: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867606.31307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867606.31331: variable 'omit' from source: magic vars 30575 1726867606.31738: variable 'ansible_distribution_major_version' from source: facts 30575 1726867606.31746: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867606.31887: variable 'network_state' from source: role '' defaults 30575 1726867606.31895: Evaluated conditional (network_state != {}): False 30575 1726867606.31899: when evaluation is False, skipping this task 30575 1726867606.31902: _execute() done 30575 1726867606.31905: dumping result to json 30575 1726867606.31909: done dumping result, returning 30575 1726867606.31918: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000000d1f] 30575 1726867606.31924: sending task result for task 0affcac9-a3a5-e081-a588-000000000d1f 30575 1726867606.32041: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d1f 30575 1726867606.32044: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867606.32102: no more pending results, returning what we have 30575 1726867606.32106: results queue empty 30575 1726867606.32107: checking 
for any_errors_fatal 30575 1726867606.32113: done checking for any_errors_fatal 30575 1726867606.32113: checking for max_fail_percentage 30575 1726867606.32114: done checking for max_fail_percentage 30575 1726867606.32115: checking to see if all hosts have failed and the running result is not ok 30575 1726867606.32116: done checking to see if all hosts have failed 30575 1726867606.32117: getting the remaining hosts for this loop 30575 1726867606.32118: done getting the remaining hosts for this loop 30575 1726867606.32121: getting the next task for host managed_node3 30575 1726867606.32129: done getting next task for host managed_node3 30575 1726867606.32133: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867606.32137: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867606.32156: getting variables 30575 1726867606.32158: in VariableManager get_vars() 30575 1726867606.32189: Calling all_inventory to load vars for managed_node3 30575 1726867606.32192: Calling groups_inventory to load vars for managed_node3 30575 1726867606.32194: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867606.32202: Calling all_plugins_play to load vars for managed_node3 30575 1726867606.32204: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867606.32206: Calling groups_plugins_play to load vars for managed_node3 30575 1726867606.33138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867606.34165: done with get_vars() 30575 1726867606.34185: done getting variables 30575 1726867606.34246: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:26:46 -0400 (0:00:00.039) 0:00:41.720 ****** 30575 1726867606.34270: entering _queue_task() for managed_node3/package 30575 1726867606.34589: worker is 1 (out of 1 available) 30575 1726867606.34602: exiting _queue_task() for managed_node3/package 30575 1726867606.34615: done queuing things up, now waiting for results queue to drain 30575 1726867606.34619: waiting for pending results... 
30575 1726867606.35009: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867606.35235: in run() - task 0affcac9-a3a5-e081-a588-000000000d20 30575 1726867606.35261: variable 'ansible_search_path' from source: unknown 30575 1726867606.35265: variable 'ansible_search_path' from source: unknown 30575 1726867606.35328: calling self._execute() 30575 1726867606.35452: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867606.35457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867606.35505: variable 'omit' from source: magic vars 30575 1726867606.35932: variable 'ansible_distribution_major_version' from source: facts 30575 1726867606.35941: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867606.36070: variable 'network_state' from source: role '' defaults 30575 1726867606.36074: Evaluated conditional (network_state != {}): False 30575 1726867606.36079: when evaluation is False, skipping this task 30575 1726867606.36082: _execute() done 30575 1726867606.36084: dumping result to json 30575 1726867606.36168: done dumping result, returning 30575 1726867606.36175: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000000d20] 30575 1726867606.36180: sending task result for task 0affcac9-a3a5-e081-a588-000000000d20 30575 1726867606.36248: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d20 30575 1726867606.36250: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867606.36297: no more pending results, returning what we have 30575 1726867606.36301: results queue empty 30575 1726867606.36301: checking for 
any_errors_fatal 30575 1726867606.36305: done checking for any_errors_fatal 30575 1726867606.36306: checking for max_fail_percentage 30575 1726867606.36307: done checking for max_fail_percentage 30575 1726867606.36307: checking to see if all hosts have failed and the running result is not ok 30575 1726867606.36308: done checking to see if all hosts have failed 30575 1726867606.36309: getting the remaining hosts for this loop 30575 1726867606.36310: done getting the remaining hosts for this loop 30575 1726867606.36313: getting the next task for host managed_node3 30575 1726867606.36321: done getting next task for host managed_node3 30575 1726867606.36325: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867606.36329: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867606.36346: getting variables 30575 1726867606.36347: in VariableManager get_vars() 30575 1726867606.36380: Calling all_inventory to load vars for managed_node3 30575 1726867606.36383: Calling groups_inventory to load vars for managed_node3 30575 1726867606.36386: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867606.36393: Calling all_plugins_play to load vars for managed_node3 30575 1726867606.36396: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867606.36398: Calling groups_plugins_play to load vars for managed_node3 30575 1726867606.37547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867606.38713: done with get_vars() 30575 1726867606.38732: done getting variables 30575 1726867606.38808: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:26:46 -0400 (0:00:00.045) 0:00:41.765 ****** 30575 1726867606.38838: entering _queue_task() for managed_node3/service 30575 1726867606.39095: worker is 1 (out of 1 available) 30575 1726867606.39109: exiting _queue_task() for managed_node3/service 30575 1726867606.39124: done queuing things up, now waiting for results queue to drain 30575 1726867606.39125: waiting for pending results... 
30575 1726867606.39357: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867606.39495: in run() - task 0affcac9-a3a5-e081-a588-000000000d21 30575 1726867606.39503: variable 'ansible_search_path' from source: unknown 30575 1726867606.39507: variable 'ansible_search_path' from source: unknown 30575 1726867606.39549: calling self._execute() 30575 1726867606.39641: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867606.39645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867606.39652: variable 'omit' from source: magic vars 30575 1726867606.39933: variable 'ansible_distribution_major_version' from source: facts 30575 1726867606.39937: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867606.40018: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867606.40179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867606.43288: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867606.43346: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867606.43401: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867606.43429: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867606.43450: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867606.43510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30575 1726867606.43555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867606.43575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.43604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867606.43629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867606.43660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867606.43678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867606.43712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.43741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867606.43752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867606.43799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867606.43810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867606.43844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.43868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867606.43881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867606.43995: variable 'network_connections' from source: include params 30575 1726867606.44005: variable 'interface' from source: play vars 30575 1726867606.44052: variable 'interface' from source: play vars 30575 1726867606.44107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867606.44413: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867606.44416: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867606.44424: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867606.44537: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867606.44590: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867606.44654: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867606.44699: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.45266: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867606.45606: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867606.46321: variable 'network_connections' from source: include params 30575 1726867606.46363: variable 'interface' from source: play vars 30575 1726867606.46561: variable 'interface' from source: play vars 30575 1726867606.46685: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867606.46688: when evaluation is False, skipping this task 30575 1726867606.46690: _execute() done 30575 1726867606.46692: dumping result to json 30575 1726867606.46694: done dumping result, returning 30575 1726867606.46698: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000000d21] 30575 1726867606.46700: sending task result for task 0affcac9-a3a5-e081-a588-000000000d21 skipping: [managed_node3] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867606.47291: no more pending results, returning what we have 30575 1726867606.47296: results queue empty 30575 1726867606.47296: checking for any_errors_fatal 30575 1726867606.47303: done checking for any_errors_fatal 30575 1726867606.47304: checking for max_fail_percentage 30575 1726867606.47306: done checking for max_fail_percentage 30575 1726867606.47307: checking to see if all hosts have failed and the running result is not ok 30575 1726867606.47308: done checking to see if all hosts have failed 30575 1726867606.47309: getting the remaining hosts for this loop 30575 1726867606.47310: done getting the remaining hosts for this loop 30575 1726867606.47315: getting the next task for host managed_node3 30575 1726867606.47324: done getting next task for host managed_node3 30575 1726867606.47328: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867606.47333: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867606.47357: getting variables 30575 1726867606.47360: in VariableManager get_vars() 30575 1726867606.47402: Calling all_inventory to load vars for managed_node3 30575 1726867606.47406: Calling groups_inventory to load vars for managed_node3 30575 1726867606.47408: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867606.47419: Calling all_plugins_play to load vars for managed_node3 30575 1726867606.47423: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867606.47426: Calling groups_plugins_play to load vars for managed_node3 30575 1726867606.48123: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d21 30575 1726867606.48127: WORKER PROCESS EXITING 30575 1726867606.52433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867606.56693: done with get_vars() 30575 1726867606.56719: done getting variables 30575 1726867606.56891: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:26:46 -0400 (0:00:00.181) 0:00:41.947 ****** 30575 1726867606.57004: entering _queue_task() for managed_node3/service 30575 1726867606.58233: worker is 1 (out of 1 available) 30575 1726867606.58246: exiting _queue_task() for managed_node3/service 30575 1726867606.58259: done queuing 
things up, now waiting for results queue to drain 30575 1726867606.58260: waiting for pending results... 30575 1726867606.58793: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867606.59106: in run() - task 0affcac9-a3a5-e081-a588-000000000d22 30575 1726867606.59120: variable 'ansible_search_path' from source: unknown 30575 1726867606.59124: variable 'ansible_search_path' from source: unknown 30575 1726867606.59162: calling self._execute() 30575 1726867606.59256: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867606.59263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867606.59273: variable 'omit' from source: magic vars 30575 1726867606.60241: variable 'ansible_distribution_major_version' from source: facts 30575 1726867606.60251: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867606.60815: variable 'network_provider' from source: set_fact 30575 1726867606.60820: variable 'network_state' from source: role '' defaults 30575 1726867606.60835: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30575 1726867606.60847: variable 'omit' from source: magic vars 30575 1726867606.61304: variable 'omit' from source: magic vars 30575 1726867606.61334: variable 'network_service_name' from source: role '' defaults 30575 1726867606.61400: variable 'network_service_name' from source: role '' defaults 30575 1726867606.61905: variable '__network_provider_setup' from source: role '' defaults 30575 1726867606.61911: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867606.61976: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867606.61986: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867606.62046: variable '__network_packages_default_nm' from source: role '' defaults 
30575 1726867606.63045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867606.67985: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867606.68038: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867606.68238: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867606.68241: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867606.68245: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867606.68249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867606.68251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867606.68346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.68350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867606.68352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867606.68367: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867606.68455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867606.68458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.68461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867606.68465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867606.68689: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867606.68809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867606.68831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867606.68855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.68894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867606.69026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867606.69029: variable 'ansible_python' from source: facts 30575 1726867606.69032: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867606.69092: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867606.69167: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867606.69290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867606.69313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867606.69340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.69556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867606.69562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867606.69565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867606.69575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867606.69582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.69585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867606.69587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867606.69933: variable 'network_connections' from source: include params 30575 1726867606.69936: variable 'interface' from source: play vars 30575 1726867606.70027: variable 'interface' from source: play vars 30575 1726867606.70139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867606.70695: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867606.70802: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867606.70872: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867606.70917: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867606.71287: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867606.71290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867606.71293: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867606.71328: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867606.71376: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867606.71785: variable 'network_connections' from source: include params 30575 1726867606.71792: variable 'interface' from source: play vars 30575 1726867606.71868: variable 'interface' from source: play vars 30575 1726867606.71905: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867606.71987: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867606.72303: variable 'network_connections' from source: include params 30575 1726867606.72307: variable 'interface' from source: play vars 30575 1726867606.72384: variable 'interface' from source: play vars 30575 1726867606.72408: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867606.72491: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867606.72786: variable 'network_connections' from source: include params 30575 1726867606.72790: variable 'interface' from source: play vars 30575 1726867606.72861: variable 'interface' from source: play vars 30575 1726867606.72921: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30575 1726867606.72979: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867606.72988: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867606.73055: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867606.73274: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867606.73985: variable 'network_connections' from source: include params 30575 1726867606.73988: variable 'interface' from source: play vars 30575 1726867606.74231: variable 'interface' from source: play vars 30575 1726867606.74239: variable 'ansible_distribution' from source: facts 30575 1726867606.74241: variable '__network_rh_distros' from source: role '' defaults 30575 1726867606.74248: variable 'ansible_distribution_major_version' from source: facts 30575 1726867606.74261: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867606.74570: variable 'ansible_distribution' from source: facts 30575 1726867606.74573: variable '__network_rh_distros' from source: role '' defaults 30575 1726867606.74575: variable 'ansible_distribution_major_version' from source: facts 30575 1726867606.74634: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867606.75082: variable 'ansible_distribution' from source: facts 30575 1726867606.75086: variable '__network_rh_distros' from source: role '' defaults 30575 1726867606.75088: variable 'ansible_distribution_major_version' from source: facts 30575 1726867606.75090: variable 'network_provider' from source: set_fact 30575 1726867606.75092: variable 'omit' from source: magic vars 30575 1726867606.75094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867606.75383: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867606.75386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867606.75389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867606.75391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867606.75482: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867606.75486: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867606.75488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867606.75513: Set connection var ansible_pipelining to False 30575 1726867606.75516: Set connection var ansible_shell_type to sh 30575 1726867606.75727: Set connection var ansible_shell_executable to /bin/sh 30575 1726867606.75731: Set connection var ansible_timeout to 10 30575 1726867606.75733: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867606.75735: Set connection var ansible_connection to ssh 30575 1726867606.75747: variable 'ansible_shell_executable' from source: unknown 30575 1726867606.75751: variable 'ansible_connection' from source: unknown 30575 1726867606.75754: variable 'ansible_module_compression' from source: unknown 30575 1726867606.75757: variable 'ansible_shell_type' from source: unknown 30575 1726867606.75760: variable 'ansible_shell_executable' from source: unknown 30575 1726867606.75762: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867606.75767: variable 'ansible_pipelining' from source: unknown 30575 1726867606.75769: variable 'ansible_timeout' from source: unknown 30575 1726867606.75774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867606.75996: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867606.76004: variable 'omit' from source: magic vars 30575 1726867606.76011: starting attempt loop 30575 1726867606.76013: running the handler 30575 1726867606.76271: variable 'ansible_facts' from source: unknown 30575 1726867606.77949: _low_level_execute_command(): starting 30575 1726867606.77952: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867606.80137: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867606.80148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867606.80280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867606.80284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867606.80365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867606.80508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' <<< 30575 1726867606.80519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867606.80584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867606.82250: stdout chunk (state=3): >>>/root <<< 30575 1726867606.82403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867606.82538: stderr chunk (state=3): >>><<< 30575 1726867606.82557: stdout chunk (state=3): >>><<< 30575 1726867606.82648: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867606.82652: _low_level_execute_command(): starting 30575 1726867606.82661: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp 
`"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867606.8259711-32605-94254598699127 `" && echo ansible-tmp-1726867606.8259711-32605-94254598699127="` echo /root/.ansible/tmp/ansible-tmp-1726867606.8259711-32605-94254598699127 `" ) && sleep 0' 30575 1726867606.83905: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867606.83909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867606.83915: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867606.83921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867606.84043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867606.84072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867606.84103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867606.84156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867606.86048: stdout chunk (state=3): >>>ansible-tmp-1726867606.8259711-32605-94254598699127=/root/.ansible/tmp/ansible-tmp-1726867606.8259711-32605-94254598699127 <<< 30575 
1726867606.86162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867606.86292: stderr chunk (state=3): >>><<< 30575 1726867606.86295: stdout chunk (state=3): >>><<< 30575 1726867606.86312: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867606.8259711-32605-94254598699127=/root/.ansible/tmp/ansible-tmp-1726867606.8259711-32605-94254598699127 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867606.86346: variable 'ansible_module_compression' from source: unknown 30575 1726867606.86399: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30575 1726867606.86458: variable 'ansible_facts' from source: unknown 30575 1726867606.86981: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726867606.8259711-32605-94254598699127/AnsiballZ_systemd.py 30575 1726867606.87566: Sending initial data 30575 1726867606.87569: Sent initial data (155 bytes) 30575 1726867606.88316: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867606.88325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867606.88339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867606.88360: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867606.88365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867606.88379: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867606.88490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867606.88498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867606.88543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867606.90107: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867606.90160: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867606.90213: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpk27lej2_ /root/.ansible/tmp/ansible-tmp-1726867606.8259711-32605-94254598699127/AnsiballZ_systemd.py <<< 30575 1726867606.90216: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867606.8259711-32605-94254598699127/AnsiballZ_systemd.py" <<< 30575 1726867606.90274: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpk27lej2_" to remote "/root/.ansible/tmp/ansible-tmp-1726867606.8259711-32605-94254598699127/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867606.8259711-32605-94254598699127/AnsiballZ_systemd.py" <<< 30575 1726867606.92915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867606.92948: stderr chunk (state=3): >>><<< 30575 1726867606.92953: stdout chunk (state=3): >>><<< 30575 1726867606.93001: done transferring module to remote 30575 1726867606.93004: _low_level_execute_command(): starting 30575 1726867606.93006: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726867606.8259711-32605-94254598699127/ /root/.ansible/tmp/ansible-tmp-1726867606.8259711-32605-94254598699127/AnsiballZ_systemd.py && sleep 0' 30575 1726867606.93583: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867606.93586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867606.93589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867606.93611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867606.93613: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867606.93625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867606.93721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867606.93772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867606.95623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867606.95626: stdout chunk (state=3): >>><<< 30575 1726867606.95629: stderr chunk (state=3): >>><<< 30575 1726867606.95648: _low_level_execute_command() 
done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867606.95650: _low_level_execute_command(): starting 30575 1726867606.95653: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867606.8259711-32605-94254598699127/AnsiballZ_systemd.py && sleep 0' 30575 1726867606.96189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867606.96227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867606.96320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867607.25548: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 
2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10559488", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3309187072", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1822447000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", 
"StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", 
"CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30575 1726867607.27250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867607.27254: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867607.27269: stderr chunk (state=3): >>><<< 30575 1726867607.27272: stdout chunk (state=3): >>><<< 30575 1726867607.27291: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10559488", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3309187072", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1822447000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867607.27413: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867606.8259711-32605-94254598699127/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867607.27430: _low_level_execute_command(): starting 30575 1726867607.27439: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867606.8259711-32605-94254598699127/ > /dev/null 2>&1 && sleep 0' 30575 1726867607.27958: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867607.27961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867607.27964: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867607.27966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867607.27968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867607.28028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867607.28032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867607.28099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867607.30184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867607.30188: stdout chunk (state=3): >>><<< 30575 1726867607.30190: stderr chunk (state=3): >>><<< 30575 1726867607.30193: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867607.30195: handler run complete 30575 1726867607.30580: attempt loop complete, returning result 30575 1726867607.30584: _execute() done 30575 1726867607.30586: dumping result to json 30575 1726867607.30588: done dumping result, returning 30575 1726867607.30590: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-e081-a588-000000000d22] 30575 1726867607.30592: sending task result for task 0affcac9-a3a5-e081-a588-000000000d22 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867607.31937: no more pending results, returning what we have 30575 1726867607.31941: results queue empty 30575 1726867607.31942: checking for any_errors_fatal 30575 1726867607.31948: done checking for any_errors_fatal 30575 1726867607.31948: checking for max_fail_percentage 30575 1726867607.31950: done checking for max_fail_percentage 30575 1726867607.31951: checking to see if all hosts have failed and the running result is not ok 30575 1726867607.31952: done checking to see if all hosts have failed 30575 1726867607.31953: getting the remaining hosts for this loop 30575 1726867607.31954: done getting the remaining hosts for this loop 30575 1726867607.31958: getting the next task for host managed_node3 30575 1726867607.31967: done getting next task for host managed_node3 30575 1726867607.31971: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867607.31976: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867607.31992: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d22 30575 1726867607.31995: WORKER PROCESS EXITING 30575 1726867607.32185: getting variables 30575 1726867607.32187: in VariableManager get_vars() 30575 1726867607.32223: Calling all_inventory to load vars for managed_node3 30575 1726867607.32226: Calling groups_inventory to load vars for managed_node3 30575 1726867607.32228: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867607.32237: Calling all_plugins_play to load vars for managed_node3 30575 1726867607.32240: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867607.32243: Calling groups_plugins_play to load vars for managed_node3 30575 1726867607.36017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867607.38802: done with get_vars() 30575 1726867607.38825: done getting variables 30575 1726867607.38889: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:26:47 -0400 (0:00:00.819) 0:00:42.766 ****** 30575 1726867607.38935: entering _queue_task() for managed_node3/service 30575 1726867607.39310: worker is 1 (out of 1 available) 30575 1726867607.39322: exiting _queue_task() for managed_node3/service 30575 1726867607.39447: done queuing things up, now waiting for results queue to drain 30575 1726867607.39449: waiting for pending results... 30575 1726867607.39693: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867607.40252: in run() - task 0affcac9-a3a5-e081-a588-000000000d23 30575 1726867607.40312: variable 'ansible_search_path' from source: unknown 30575 1726867607.40316: variable 'ansible_search_path' from source: unknown 30575 1726867607.40325: calling self._execute() 30575 1726867607.40587: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867607.40593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867607.40603: variable 'omit' from source: magic vars 30575 1726867607.41285: variable 'ansible_distribution_major_version' from source: facts 30575 1726867607.41309: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867607.41516: variable 'network_provider' from source: set_fact 30575 1726867607.41519: Evaluated conditional (network_provider == "nm"): True 30575 1726867607.41542: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 
1726867607.41638: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867607.41811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867607.45074: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867607.45270: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867607.45418: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867607.45468: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867607.45565: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867607.45713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867607.45851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867607.45892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867607.45939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867607.46018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30575 1726867607.46218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867607.46221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867607.46244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867607.46436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867607.46439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867607.46472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867607.46505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867607.46621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867607.46671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867607.46748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867607.47015: variable 'network_connections' from source: include params 30575 1726867607.47069: variable 'interface' from source: play vars 30575 1726867607.47275: variable 'interface' from source: play vars 30575 1726867607.47447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867607.47815: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867607.47856: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867607.47897: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867607.47952: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867607.48104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867607.48132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867607.48167: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867607.48233: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867607.48333: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867607.48948: variable 'network_connections' from source: include params 30575 1726867607.49009: variable 'interface' from source: play vars 30575 1726867607.49032: variable 'interface' from source: play vars 30575 1726867607.49091: Evaluated conditional (__network_wpa_supplicant_required): False 30575 1726867607.49172: when evaluation is False, skipping this task 30575 1726867607.49183: _execute() done 30575 1726867607.49191: dumping result to json 30575 1726867607.49198: done dumping result, returning 30575 1726867607.49210: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-e081-a588-000000000d23] 30575 1726867607.49235: sending task result for task 0affcac9-a3a5-e081-a588-000000000d23 30575 1726867607.49785: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d23 30575 1726867607.49789: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30575 1726867607.49830: no more pending results, returning what we have 30575 1726867607.49834: results queue empty 30575 1726867607.49835: checking for any_errors_fatal 30575 1726867607.49861: done checking for any_errors_fatal 30575 1726867607.49862: checking for max_fail_percentage 30575 1726867607.49864: done checking for max_fail_percentage 30575 1726867607.49865: checking to see if all hosts have failed and the running result is not ok 30575 1726867607.49866: done checking to see if all hosts have failed 30575 1726867607.49867: getting the remaining hosts for this loop 30575 1726867607.49868: done getting the remaining hosts for this loop 30575 1726867607.49871: getting the next task 
for host managed_node3 30575 1726867607.49881: done getting next task for host managed_node3 30575 1726867607.49885: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867607.49894: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867607.49912: getting variables 30575 1726867607.49914: in VariableManager get_vars() 30575 1726867607.49951: Calling all_inventory to load vars for managed_node3 30575 1726867607.49954: Calling groups_inventory to load vars for managed_node3 30575 1726867607.49956: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867607.49966: Calling all_plugins_play to load vars for managed_node3 30575 1726867607.49969: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867607.49972: Calling groups_plugins_play to load vars for managed_node3 30575 1726867607.53545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867607.58332: done with get_vars() 30575 1726867607.58481: done getting variables 30575 1726867607.58542: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:26:47 -0400 (0:00:00.199) 0:00:42.966 ****** 30575 1726867607.58898: entering _queue_task() for managed_node3/service 30575 1726867607.59789: worker is 1 (out of 1 available) 30575 1726867607.59800: exiting _queue_task() for managed_node3/service 30575 1726867607.59812: done queuing things up, now waiting for results queue to drain 30575 1726867607.59813: waiting for pending results... 
30575 1726867607.60571: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867607.61150: in run() - task 0affcac9-a3a5-e081-a588-000000000d24 30575 1726867607.61164: variable 'ansible_search_path' from source: unknown 30575 1726867607.61168: variable 'ansible_search_path' from source: unknown 30575 1726867607.61399: calling self._execute() 30575 1726867607.61783: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867607.61787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867607.61790: variable 'omit' from source: magic vars 30575 1726867607.62783: variable 'ansible_distribution_major_version' from source: facts 30575 1726867607.63000: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867607.63117: variable 'network_provider' from source: set_fact 30575 1726867607.63127: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867607.63136: when evaluation is False, skipping this task 30575 1726867607.63139: _execute() done 30575 1726867607.63142: dumping result to json 30575 1726867607.63364: done dumping result, returning 30575 1726867607.63372: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-e081-a588-000000000d24] 30575 1726867607.63380: sending task result for task 0affcac9-a3a5-e081-a588-000000000d24 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867607.63562: no more pending results, returning what we have 30575 1726867607.63566: results queue empty 30575 1726867607.63569: checking for any_errors_fatal 30575 1726867607.63586: done checking for any_errors_fatal 30575 1726867607.63587: checking for max_fail_percentage 30575 1726867607.63589: done checking for max_fail_percentage 30575 
1726867607.63589: checking to see if all hosts have failed and the running result is not ok 30575 1726867607.63590: done checking to see if all hosts have failed 30575 1726867607.63591: getting the remaining hosts for this loop 30575 1726867607.63593: done getting the remaining hosts for this loop 30575 1726867607.63596: getting the next task for host managed_node3 30575 1726867607.63606: done getting next task for host managed_node3 30575 1726867607.63611: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867607.63616: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867607.63639: getting variables 30575 1726867607.63641: in VariableManager get_vars() 30575 1726867607.63784: Calling all_inventory to load vars for managed_node3 30575 1726867607.63787: Calling groups_inventory to load vars for managed_node3 30575 1726867607.63791: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867607.63808: Calling all_plugins_play to load vars for managed_node3 30575 1726867607.63811: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867607.63814: Calling groups_plugins_play to load vars for managed_node3 30575 1726867607.64708: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d24 30575 1726867607.64712: WORKER PROCESS EXITING 30575 1726867607.66398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867607.69407: done with get_vars() 30575 1726867607.69435: done getting variables 30575 1726867607.69621: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:26:47 -0400 (0:00:00.107) 0:00:43.074 ****** 30575 1726867607.69775: entering _queue_task() for managed_node3/copy 30575 1726867607.70474: worker is 1 (out of 1 available) 30575 1726867607.70519: exiting _queue_task() for managed_node3/copy 30575 1726867607.70564: done queuing things up, now waiting for results queue to drain 30575 1726867607.70566: waiting for pending results... 
30575 1726867607.71055: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867607.71786: in run() - task 0affcac9-a3a5-e081-a588-000000000d25 30575 1726867607.71791: variable 'ansible_search_path' from source: unknown 30575 1726867607.71794: variable 'ansible_search_path' from source: unknown 30575 1726867607.71797: calling self._execute() 30575 1726867607.72235: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867607.72239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867607.72241: variable 'omit' from source: magic vars 30575 1726867607.72871: variable 'ansible_distribution_major_version' from source: facts 30575 1726867607.72890: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867607.73195: variable 'network_provider' from source: set_fact 30575 1726867607.73211: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867607.73218: when evaluation is False, skipping this task 30575 1726867607.73225: _execute() done 30575 1726867607.73231: dumping result to json 30575 1726867607.73238: done dumping result, returning 30575 1726867607.73251: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-e081-a588-000000000d25] 30575 1726867607.73321: sending task result for task 0affcac9-a3a5-e081-a588-000000000d25 skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30575 1726867607.73473: no more pending results, returning what we have 30575 1726867607.73480: results queue empty 30575 1726867607.73481: checking for any_errors_fatal 30575 1726867607.73488: done checking for any_errors_fatal 30575 1726867607.73489: checking for max_fail_percentage 30575 
1726867607.73491: done checking for max_fail_percentage 30575 1726867607.73492: checking to see if all hosts have failed and the running result is not ok 30575 1726867607.73493: done checking to see if all hosts have failed 30575 1726867607.73494: getting the remaining hosts for this loop 30575 1726867607.73495: done getting the remaining hosts for this loop 30575 1726867607.73499: getting the next task for host managed_node3 30575 1726867607.73508: done getting next task for host managed_node3 30575 1726867607.73512: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867607.73683: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867607.73709: getting variables 30575 1726867607.73712: in VariableManager get_vars() 30575 1726867607.73754: Calling all_inventory to load vars for managed_node3 30575 1726867607.73756: Calling groups_inventory to load vars for managed_node3 30575 1726867607.73758: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867607.73770: Calling all_plugins_play to load vars for managed_node3 30575 1726867607.73773: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867607.73776: Calling groups_plugins_play to load vars for managed_node3 30575 1726867607.75044: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d25 30575 1726867607.75047: WORKER PROCESS EXITING 30575 1726867607.76742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867607.78394: done with get_vars() 30575 1726867607.78421: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:26:47 -0400 (0:00:00.088) 0:00:43.162 ****** 30575 1726867607.78519: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867607.79176: worker is 1 (out of 1 available) 30575 1726867607.79294: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867607.79334: done queuing things up, now waiting for results queue to drain 30575 1726867607.79336: waiting for pending results... 
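The two `skipping: [managed_node3] => {...}` results above both follow the same path: the task's `when` conditional (`network_provider == "initscripts"`) evaluates to False, so the executor short-circuits before running the module and emits a skipped result. As a hedged sketch — this is *not* Ansible's actual `TaskExecutor` code, just the decision shape visible in the log, with an illustrative `run_task` helper:

```python
# Illustrative sketch of the skip decision seen in the log output:
# a False `when` conditional yields a "skipped" result dict instead of
# executing the task's module. Field names mirror the JSON in the log.
def run_task(false_condition: str, conditional_value: bool) -> dict:
    if not conditional_value:
        # Short-circuit: matches `when evaluation is False, skipping this task`
        return {
            "changed": False,
            "false_condition": false_condition,
            "skip_reason": "Conditional result was False",
        }
    # Otherwise the module would actually run (elided here).
    return {"changed": True}

result = run_task('network_provider == "initscripts"', False)
print(result["skip_reason"])
```

Note that the earlier "Enable and start wpa_supplicant" skip at 1726867607.49789 carries the same three fields, just with `__network_wpa_supplicant_required` as the `false_condition`.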
30575 1726867607.79841: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867607.79997: in run() - task 0affcac9-a3a5-e081-a588-000000000d26 30575 1726867607.80020: variable 'ansible_search_path' from source: unknown 30575 1726867607.80029: variable 'ansible_search_path' from source: unknown 30575 1726867607.80082: calling self._execute() 30575 1726867607.80185: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867607.80199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867607.80215: variable 'omit' from source: magic vars 30575 1726867607.80596: variable 'ansible_distribution_major_version' from source: facts 30575 1726867607.80620: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867607.80632: variable 'omit' from source: magic vars 30575 1726867607.80697: variable 'omit' from source: magic vars 30575 1726867607.80936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867607.83607: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867607.83681: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867607.83850: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867607.83858: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867607.84194: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867607.84197: variable 'network_provider' from source: set_fact 30575 1726867607.84352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867607.84380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867607.84407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867607.84565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867607.84581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867607.84743: variable 'omit' from source: magic vars 30575 1726867607.84880: variable 'omit' from source: magic vars 30575 1726867607.84993: variable 'network_connections' from source: include params 30575 1726867607.85004: variable 'interface' from source: play vars 30575 1726867607.85069: variable 'interface' from source: play vars 30575 1726867607.85226: variable 'omit' from source: magic vars 30575 1726867607.85240: variable '__lsr_ansible_managed' from source: task vars 30575 1726867607.85302: variable '__lsr_ansible_managed' from source: task vars 30575 1726867607.85485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30575 1726867607.85718: Loaded config def from plugin (lookup/template) 30575 1726867607.85724: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30575 1726867607.85755: File lookup term: get_ansible_managed.j2 30575 1726867607.85758: variable 
'ansible_search_path' from source: unknown 30575 1726867607.85761: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30575 1726867607.85779: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30575 1726867607.85797: variable 'ansible_search_path' from source: unknown 30575 1726867607.93389: variable 'ansible_managed' from source: unknown 30575 1726867607.93481: variable 'omit' from source: magic vars 30575 1726867607.93505: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867607.93526: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867607.93545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867607.93558: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30575 1726867607.93566: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867607.93591: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867607.93594: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867607.93597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867607.93664: Set connection var ansible_pipelining to False 30575 1726867607.93667: Set connection var ansible_shell_type to sh 30575 1726867607.93672: Set connection var ansible_shell_executable to /bin/sh 30575 1726867607.93679: Set connection var ansible_timeout to 10 30575 1726867607.93684: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867607.93691: Set connection var ansible_connection to ssh 30575 1726867607.93709: variable 'ansible_shell_executable' from source: unknown 30575 1726867607.93711: variable 'ansible_connection' from source: unknown 30575 1726867607.93714: variable 'ansible_module_compression' from source: unknown 30575 1726867607.93718: variable 'ansible_shell_type' from source: unknown 30575 1726867607.93721: variable 'ansible_shell_executable' from source: unknown 30575 1726867607.93724: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867607.93726: variable 'ansible_pipelining' from source: unknown 30575 1726867607.93728: variable 'ansible_timeout' from source: unknown 30575 1726867607.93730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867607.93823: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867607.93833: variable 'omit' from 
source: magic vars 30575 1726867607.93836: starting attempt loop 30575 1726867607.93839: running the handler 30575 1726867607.93849: _low_level_execute_command(): starting 30575 1726867607.93857: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867607.94332: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867607.94336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867607.94339: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867607.94393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867607.94397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867607.94457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867607.96148: stdout chunk (state=3): >>>/root <<< 30575 1726867607.96268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867607.96283: stderr chunk (state=3): >>><<< 30575 1726867607.96286: stdout chunk (state=3): >>><<< 30575 
1726867607.96332: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867607.96338: _low_level_execute_command(): starting 30575 1726867607.96341: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867607.9630923-32661-84720370068290 `" && echo ansible-tmp-1726867607.9630923-32661-84720370068290="` echo /root/.ansible/tmp/ansible-tmp-1726867607.9630923-32661-84720370068290 `" ) && sleep 0' 30575 1726867607.96900: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867607.96903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867607.96995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867607.97008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867607.97042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867607.98934: stdout chunk (state=3): >>>ansible-tmp-1726867607.9630923-32661-84720370068290=/root/.ansible/tmp/ansible-tmp-1726867607.9630923-32661-84720370068290 <<< 30575 1726867607.99188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867607.99191: stdout chunk (state=3): >>><<< 30575 1726867607.99194: stderr chunk (state=3): >>><<< 30575 1726867607.99196: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867607.9630923-32661-84720370068290=/root/.ansible/tmp/ansible-tmp-1726867607.9630923-32661-84720370068290 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867607.99199: variable 'ansible_module_compression' from source: unknown 30575 1726867607.99201: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30575 1726867607.99239: variable 'ansible_facts' from source: unknown 30575 1726867607.99391: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867607.9630923-32661-84720370068290/AnsiballZ_network_connections.py 30575 1726867607.99612: Sending initial data 30575 1726867607.99617: Sent initial data (167 bytes) 30575 1726867608.00200: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867608.00212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass <<< 30575 1726867608.00227: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867608.00272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867608.00294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867608.00334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867608.01871: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867608.01875: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867608.01919: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867608.01966: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpfgkvqey_ /root/.ansible/tmp/ansible-tmp-1726867607.9630923-32661-84720370068290/AnsiballZ_network_connections.py <<< 30575 1726867608.01969: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867607.9630923-32661-84720370068290/AnsiballZ_network_connections.py" <<< 30575 1726867608.02019: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpfgkvqey_" to remote "/root/.ansible/tmp/ansible-tmp-1726867607.9630923-32661-84720370068290/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867607.9630923-32661-84720370068290/AnsiballZ_network_connections.py" <<< 30575 1726867608.03190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867608.03194: stderr chunk (state=3): >>><<< 30575 1726867608.03196: stdout chunk (state=3): >>><<< 30575 1726867608.03222: done transferring module to remote 30575 1726867608.03231: _low_level_execute_command(): starting 30575 1726867608.03236: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867607.9630923-32661-84720370068290/ /root/.ansible/tmp/ansible-tmp-1726867607.9630923-32661-84720370068290/AnsiballZ_network_connections.py && sleep 0' 30575 1726867608.03994: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867608.04000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867608.04025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867608.04028: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867608.04033: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867608.04113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867608.04132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867608.04176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867608.05990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867608.05994: stdout chunk (state=3): >>><<< 30575 1726867608.05999: stderr chunk (state=3): >>><<< 30575 1726867608.06033: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867608.06036: _low_level_execute_command(): starting 30575 1726867608.06038: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867607.9630923-32661-84720370068290/AnsiballZ_network_connections.py && sleep 0' 30575 1726867608.06750: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867608.06756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867608.06759: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867608.06762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867608.06764: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867608.06800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867608.06846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867608.31519: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, ade586ae-171f-45bd-a4ea-cde3464255eb skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30575 1726867608.33258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867608.33266: stdout chunk (state=3): >>><<< 30575 1726867608.33269: stderr chunk (state=3): >>><<< 30575 1726867608.33314: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, ade586ae-171f-45bd-a4ea-cde3464255eb skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867608.33336: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867607.9630923-32661-84720370068290/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867608.33344: _low_level_execute_command(): starting 30575 1726867608.33349: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867607.9630923-32661-84720370068290/ > /dev/null 2>&1 && sleep 0' 30575 1726867608.33933: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867608.33936: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867608.33939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867608.33941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867608.34000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867608.34007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867608.34047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867608.35873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867608.35896: stderr chunk (state=3): >>><<< 30575 1726867608.35899: stdout chunk (state=3): >>><<< 30575 1726867608.35912: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867608.35918: handler run complete 30575 1726867608.35941: attempt loop complete, returning result 30575 1726867608.35944: _execute() done 30575 1726867608.35946: dumping result to json 30575 1726867608.35952: done dumping result, returning 30575 1726867608.35960: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-e081-a588-000000000d26] 30575 1726867608.35966: sending task result for task 0affcac9-a3a5-e081-a588-000000000d26 30575 1726867608.36067: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d26 30575 1726867608.36069: WORKER PROCESS EXITING ok: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, ade586ae-171f-45bd-a4ea-cde3464255eb skipped because already active 30575 1726867608.36165: no more pending results, returning what we have 30575 1726867608.36168: results queue empty 30575 1726867608.36169: checking for any_errors_fatal 30575 1726867608.36174: done checking for any_errors_fatal 30575 1726867608.36175: checking for max_fail_percentage 30575 1726867608.36176: done checking for max_fail_percentage 30575 1726867608.36179: checking to see if all hosts have failed and the running result is not ok 30575 1726867608.36180: done checking to see if all hosts have failed 30575 1726867608.36181: getting the remaining hosts for this loop 30575 1726867608.36182: done getting the 
remaining hosts for this loop 30575 1726867608.36185: getting the next task for host managed_node3 30575 1726867608.36193: done getting next task for host managed_node3 30575 1726867608.36196: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867608.36200: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867608.36211: getting variables 30575 1726867608.36212: in VariableManager get_vars() 30575 1726867608.36246: Calling all_inventory to load vars for managed_node3 30575 1726867608.36248: Calling groups_inventory to load vars for managed_node3 30575 1726867608.36250: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867608.36259: Calling all_plugins_play to load vars for managed_node3 30575 1726867608.36261: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867608.36264: Calling groups_plugins_play to load vars for managed_node3 30575 1726867608.37421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867608.38625: done with get_vars() 30575 1726867608.38642: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:26:48 -0400 (0:00:00.602) 0:00:43.764 ****** 30575 1726867608.38736: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867608.39066: worker is 1 (out of 1 available) 30575 1726867608.39076: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867608.39092: done queuing things up, now waiting for results queue to drain 30575 1726867608.39094: waiting for pending results... 
30575 1726867608.39321: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867608.39421: in run() - task 0affcac9-a3a5-e081-a588-000000000d27 30575 1726867608.39436: variable 'ansible_search_path' from source: unknown 30575 1726867608.39441: variable 'ansible_search_path' from source: unknown 30575 1726867608.39470: calling self._execute() 30575 1726867608.39542: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867608.39551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867608.39561: variable 'omit' from source: magic vars 30575 1726867608.39844: variable 'ansible_distribution_major_version' from source: facts 30575 1726867608.39853: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867608.39944: variable 'network_state' from source: role '' defaults 30575 1726867608.39954: Evaluated conditional (network_state != {}): False 30575 1726867608.39957: when evaluation is False, skipping this task 30575 1726867608.39960: _execute() done 30575 1726867608.39962: dumping result to json 30575 1726867608.39965: done dumping result, returning 30575 1726867608.39973: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-e081-a588-000000000d27] 30575 1726867608.39980: sending task result for task 0affcac9-a3a5-e081-a588-000000000d27 30575 1726867608.40061: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d27 30575 1726867608.40065: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867608.40148: no more pending results, returning what we have 30575 1726867608.40151: results queue empty 30575 1726867608.40152: checking for any_errors_fatal 30575 1726867608.40159: done checking for any_errors_fatal 
30575 1726867608.40160: checking for max_fail_percentage 30575 1726867608.40161: done checking for max_fail_percentage 30575 1726867608.40162: checking to see if all hosts have failed and the running result is not ok 30575 1726867608.40163: done checking to see if all hosts have failed 30575 1726867608.40164: getting the remaining hosts for this loop 30575 1726867608.40165: done getting the remaining hosts for this loop 30575 1726867608.40168: getting the next task for host managed_node3 30575 1726867608.40174: done getting next task for host managed_node3 30575 1726867608.40179: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867608.40184: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867608.40200: getting variables 30575 1726867608.40201: in VariableManager get_vars() 30575 1726867608.40228: Calling all_inventory to load vars for managed_node3 30575 1726867608.40231: Calling groups_inventory to load vars for managed_node3 30575 1726867608.40233: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867608.40241: Calling all_plugins_play to load vars for managed_node3 30575 1726867608.40243: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867608.40245: Calling groups_plugins_play to load vars for managed_node3 30575 1726867608.41119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867608.42593: done with get_vars() 30575 1726867608.42613: done getting variables 30575 1726867608.42669: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:26:48 -0400 (0:00:00.039) 0:00:43.804 ****** 30575 1726867608.42705: entering _queue_task() for managed_node3/debug 30575 1726867608.42974: worker is 1 (out of 1 available) 30575 1726867608.42988: exiting _queue_task() for managed_node3/debug 30575 1726867608.43001: done queuing things up, now waiting for results queue to drain 30575 1726867608.43002: waiting for pending results... 
30575 1726867608.43497: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867608.43645: in run() - task 0affcac9-a3a5-e081-a588-000000000d28 30575 1726867608.43649: variable 'ansible_search_path' from source: unknown 30575 1726867608.43652: variable 'ansible_search_path' from source: unknown 30575 1726867608.43655: calling self._execute() 30575 1726867608.43745: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867608.43753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867608.43790: variable 'omit' from source: magic vars 30575 1726867608.44131: variable 'ansible_distribution_major_version' from source: facts 30575 1726867608.44145: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867608.44166: variable 'omit' from source: magic vars 30575 1726867608.44200: variable 'omit' from source: magic vars 30575 1726867608.44227: variable 'omit' from source: magic vars 30575 1726867608.44259: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867608.44293: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867608.44307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867608.44330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867608.44336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867608.44358: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867608.44361: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867608.44364: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30575 1726867608.44437: Set connection var ansible_pipelining to False 30575 1726867608.44445: Set connection var ansible_shell_type to sh 30575 1726867608.44448: Set connection var ansible_shell_executable to /bin/sh 30575 1726867608.44451: Set connection var ansible_timeout to 10 30575 1726867608.44453: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867608.44461: Set connection var ansible_connection to ssh 30575 1726867608.44479: variable 'ansible_shell_executable' from source: unknown 30575 1726867608.44482: variable 'ansible_connection' from source: unknown 30575 1726867608.44485: variable 'ansible_module_compression' from source: unknown 30575 1726867608.44488: variable 'ansible_shell_type' from source: unknown 30575 1726867608.44490: variable 'ansible_shell_executable' from source: unknown 30575 1726867608.44492: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867608.44494: variable 'ansible_pipelining' from source: unknown 30575 1726867608.44498: variable 'ansible_timeout' from source: unknown 30575 1726867608.44502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867608.44606: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867608.44617: variable 'omit' from source: magic vars 30575 1726867608.44620: starting attempt loop 30575 1726867608.44623: running the handler 30575 1726867608.44717: variable '__network_connections_result' from source: set_fact 30575 1726867608.44759: handler run complete 30575 1726867608.44773: attempt loop complete, returning result 30575 1726867608.44776: _execute() done 30575 1726867608.44780: dumping result to json 30575 1726867608.44783: 
done dumping result, returning 30575 1726867608.44792: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-e081-a588-000000000d28] 30575 1726867608.44797: sending task result for task 0affcac9-a3a5-e081-a588-000000000d28 30575 1726867608.44878: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d28 30575 1726867608.44881: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, ade586ae-171f-45bd-a4ea-cde3464255eb skipped because already active" ] } 30575 1726867608.44950: no more pending results, returning what we have 30575 1726867608.44953: results queue empty 30575 1726867608.44954: checking for any_errors_fatal 30575 1726867608.44959: done checking for any_errors_fatal 30575 1726867608.44960: checking for max_fail_percentage 30575 1726867608.44961: done checking for max_fail_percentage 30575 1726867608.44962: checking to see if all hosts have failed and the running result is not ok 30575 1726867608.44963: done checking to see if all hosts have failed 30575 1726867608.44964: getting the remaining hosts for this loop 30575 1726867608.44965: done getting the remaining hosts for this loop 30575 1726867608.44968: getting the next task for host managed_node3 30575 1726867608.44975: done getting next task for host managed_node3 30575 1726867608.44981: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867608.44986: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867608.44997: getting variables 30575 1726867608.44999: in VariableManager get_vars() 30575 1726867608.45028: Calling all_inventory to load vars for managed_node3 30575 1726867608.45030: Calling groups_inventory to load vars for managed_node3 30575 1726867608.45032: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867608.45040: Calling all_plugins_play to load vars for managed_node3 30575 1726867608.45042: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867608.45044: Calling groups_plugins_play to load vars for managed_node3 30575 1726867608.45797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867608.47166: done with get_vars() 30575 1726867608.47187: done getting variables 30575 1726867608.47243: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:26:48 -0400 (0:00:00.045) 0:00:43.850 ****** 30575 1726867608.47283: entering _queue_task() for managed_node3/debug 30575 1726867608.47541: worker is 1 (out of 1 available) 30575 1726867608.47553: exiting _queue_task() for managed_node3/debug 30575 1726867608.47565: done queuing things up, now waiting for results queue to drain 30575 1726867608.47567: waiting for pending results... 30575 1726867608.48031: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867608.48035: in run() - task 0affcac9-a3a5-e081-a588-000000000d29 30575 1726867608.48038: variable 'ansible_search_path' from source: unknown 30575 1726867608.48041: variable 'ansible_search_path' from source: unknown 30575 1726867608.48051: calling self._execute() 30575 1726867608.48150: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867608.48156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867608.48166: variable 'omit' from source: magic vars 30575 1726867608.48548: variable 'ansible_distribution_major_version' from source: facts 30575 1726867608.48560: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867608.48567: variable 'omit' from source: magic vars 30575 1726867608.48671: variable 'omit' from source: magic vars 30575 1726867608.48675: variable 'omit' from source: magic vars 30575 1726867608.48712: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867608.48750: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867608.48772: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867608.48791: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867608.48803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867608.48838: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867608.48841: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867608.48844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867608.48953: Set connection var ansible_pipelining to False 30575 1726867608.48956: Set connection var ansible_shell_type to sh 30575 1726867608.48966: Set connection var ansible_shell_executable to /bin/sh 30575 1726867608.48968: Set connection var ansible_timeout to 10 30575 1726867608.48971: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867608.48986: Set connection var ansible_connection to ssh 30575 1726867608.49011: variable 'ansible_shell_executable' from source: unknown 30575 1726867608.49014: variable 'ansible_connection' from source: unknown 30575 1726867608.49017: variable 'ansible_module_compression' from source: unknown 30575 1726867608.49019: variable 'ansible_shell_type' from source: unknown 30575 1726867608.49024: variable 'ansible_shell_executable' from source: unknown 30575 1726867608.49026: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867608.49031: variable 'ansible_pipelining' from source: unknown 30575 1726867608.49033: variable 'ansible_timeout' from source: unknown 30575 1726867608.49037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867608.49176: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867608.49187: variable 'omit' from source: magic vars 30575 1726867608.49195: starting attempt loop 30575 1726867608.49198: running the handler 30575 1726867608.49247: variable '__network_connections_result' from source: set_fact 30575 1726867608.49326: variable '__network_connections_result' from source: set_fact 30575 1726867608.49432: handler run complete 30575 1726867608.49454: attempt loop complete, returning result 30575 1726867608.49457: _execute() done 30575 1726867608.49459: dumping result to json 30575 1726867608.49461: done dumping result, returning 30575 1726867608.49472: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-e081-a588-000000000d29] 30575 1726867608.49478: sending task result for task 0affcac9-a3a5-e081-a588-000000000d29 30575 1726867608.49575: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d29 30575 1726867608.49580: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, ade586ae-171f-45bd-a4ea-cde3464255eb skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, ade586ae-171f-45bd-a4ea-cde3464255eb skipped because already active" ] } } 30575 1726867608.49684: no more pending results, returning what we have 30575 1726867608.49687: results queue empty 30575 
1726867608.49688: checking for any_errors_fatal 30575 1726867608.49694: done checking for any_errors_fatal 30575 1726867608.49694: checking for max_fail_percentage 30575 1726867608.49696: done checking for max_fail_percentage 30575 1726867608.49697: checking to see if all hosts have failed and the running result is not ok 30575 1726867608.49698: done checking to see if all hosts have failed 30575 1726867608.49699: getting the remaining hosts for this loop 30575 1726867608.49701: done getting the remaining hosts for this loop 30575 1726867608.49705: getting the next task for host managed_node3 30575 1726867608.49713: done getting next task for host managed_node3 30575 1726867608.49720: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867608.49725: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867608.49737: getting variables 30575 1726867608.49739: in VariableManager get_vars() 30575 1726867608.49774: Calling all_inventory to load vars for managed_node3 30575 1726867608.49776: Calling groups_inventory to load vars for managed_node3 30575 1726867608.49787: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867608.49798: Calling all_plugins_play to load vars for managed_node3 30575 1726867608.49801: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867608.49805: Calling groups_plugins_play to load vars for managed_node3 30575 1726867608.51285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867608.52150: done with get_vars() 30575 1726867608.52164: done getting variables 30575 1726867608.52205: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:26:48 -0400 (0:00:00.049) 0:00:43.899 ****** 30575 1726867608.52231: entering _queue_task() for managed_node3/debug 30575 1726867608.52431: worker is 1 (out of 1 available) 30575 1726867608.52444: exiting _queue_task() for managed_node3/debug 30575 1726867608.52457: done queuing things up, now waiting for results queue to drain 30575 1726867608.52459: waiting for pending results... 
30575 1726867608.52646: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867608.52889: in run() - task 0affcac9-a3a5-e081-a588-000000000d2a 30575 1726867608.52893: variable 'ansible_search_path' from source: unknown 30575 1726867608.52896: variable 'ansible_search_path' from source: unknown 30575 1726867608.52898: calling self._execute() 30575 1726867608.52981: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867608.53002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867608.53029: variable 'omit' from source: magic vars 30575 1726867608.53498: variable 'ansible_distribution_major_version' from source: facts 30575 1726867608.53504: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867608.53712: variable 'network_state' from source: role '' defaults 30575 1726867608.53734: Evaluated conditional (network_state != {}): False 30575 1726867608.53738: when evaluation is False, skipping this task 30575 1726867608.53740: _execute() done 30575 1726867608.53743: dumping result to json 30575 1726867608.53745: done dumping result, returning 30575 1726867608.53748: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-e081-a588-000000000d2a] 30575 1726867608.53755: sending task result for task 0affcac9-a3a5-e081-a588-000000000d2a 30575 1726867608.53851: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d2a 30575 1726867608.53856: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30575 1726867608.53910: no more pending results, returning what we have 30575 1726867608.53914: results queue empty 30575 1726867608.53915: checking for any_errors_fatal 30575 1726867608.53928: done checking for any_errors_fatal 30575 1726867608.53929: checking for 
max_fail_percentage 30575 1726867608.53931: done checking for max_fail_percentage 30575 1726867608.53932: checking to see if all hosts have failed and the running result is not ok 30575 1726867608.53933: done checking to see if all hosts have failed 30575 1726867608.53933: getting the remaining hosts for this loop 30575 1726867608.53935: done getting the remaining hosts for this loop 30575 1726867608.53939: getting the next task for host managed_node3 30575 1726867608.53946: done getting next task for host managed_node3 30575 1726867608.53951: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867608.53955: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867608.53970: getting variables 30575 1726867608.53972: in VariableManager get_vars() 30575 1726867608.54001: Calling all_inventory to load vars for managed_node3 30575 1726867608.54004: Calling groups_inventory to load vars for managed_node3 30575 1726867608.54006: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867608.54013: Calling all_plugins_play to load vars for managed_node3 30575 1726867608.54015: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867608.54020: Calling groups_plugins_play to load vars for managed_node3 30575 1726867608.54763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867608.55631: done with get_vars() 30575 1726867608.55652: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:26:48 -0400 (0:00:00.035) 0:00:43.934 ****** 30575 1726867608.55761: entering _queue_task() for managed_node3/ping 30575 1726867608.56043: worker is 1 (out of 1 available) 30575 1726867608.56055: exiting _queue_task() for managed_node3/ping 30575 1726867608.56068: done queuing things up, now waiting for results queue to drain 30575 1726867608.56070: waiting for pending results... 
30575 1726867608.56356: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867608.56400: in run() - task 0affcac9-a3a5-e081-a588-000000000d2b 30575 1726867608.56413: variable 'ansible_search_path' from source: unknown 30575 1726867608.56417: variable 'ansible_search_path' from source: unknown 30575 1726867608.56454: calling self._execute() 30575 1726867608.56550: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867608.56553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867608.56560: variable 'omit' from source: magic vars 30575 1726867608.56920: variable 'ansible_distribution_major_version' from source: facts 30575 1726867608.56934: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867608.56940: variable 'omit' from source: magic vars 30575 1726867608.57002: variable 'omit' from source: magic vars 30575 1726867608.57037: variable 'omit' from source: magic vars 30575 1726867608.57075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867608.57109: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867608.57133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867608.57152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867608.57163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867608.57202: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867608.57205: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867608.57207: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867608.57300: Set connection var ansible_pipelining to False 30575 1726867608.57309: Set connection var ansible_shell_type to sh 30575 1726867608.57325: Set connection var ansible_shell_executable to /bin/sh 30575 1726867608.57336: Set connection var ansible_timeout to 10 30575 1726867608.57348: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867608.57364: Set connection var ansible_connection to ssh 30575 1726867608.57405: variable 'ansible_shell_executable' from source: unknown 30575 1726867608.57408: variable 'ansible_connection' from source: unknown 30575 1726867608.57425: variable 'ansible_module_compression' from source: unknown 30575 1726867608.57431: variable 'ansible_shell_type' from source: unknown 30575 1726867608.57434: variable 'ansible_shell_executable' from source: unknown 30575 1726867608.57436: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867608.57438: variable 'ansible_pipelining' from source: unknown 30575 1726867608.57440: variable 'ansible_timeout' from source: unknown 30575 1726867608.57442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867608.57587: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867608.57596: variable 'omit' from source: magic vars 30575 1726867608.57601: starting attempt loop 30575 1726867608.57603: running the handler 30575 1726867608.57614: _low_level_execute_command(): starting 30575 1726867608.57624: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867608.58118: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 
1726867608.58122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867608.58125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867608.58128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867608.58182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867608.58186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867608.58188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867608.58237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867608.59940: stdout chunk (state=3): >>>/root <<< 30575 1726867608.60182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867608.60185: stdout chunk (state=3): >>><<< 30575 1726867608.60188: stderr chunk (state=3): >>><<< 30575 1726867608.60191: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867608.60193: _low_level_execute_command(): starting 30575 1726867608.60195: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867608.600946-32695-96532750044518 `" && echo ansible-tmp-1726867608.600946-32695-96532750044518="` echo /root/.ansible/tmp/ansible-tmp-1726867608.600946-32695-96532750044518 `" ) && sleep 0' 30575 1726867608.60638: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867608.60644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867608.60670: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867608.60673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867608.60726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867608.60729: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867608.60789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867608.62732: stdout chunk (state=3): >>>ansible-tmp-1726867608.600946-32695-96532750044518=/root/.ansible/tmp/ansible-tmp-1726867608.600946-32695-96532750044518 <<< 30575 1726867608.62905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867608.62908: stdout chunk (state=3): >>><<< 30575 1726867608.62910: stderr chunk (state=3): >>><<< 30575 1726867608.63125: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867608.600946-32695-96532750044518=/root/.ansible/tmp/ansible-tmp-1726867608.600946-32695-96532750044518 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867608.63129: variable 'ansible_module_compression' from source: unknown 30575 1726867608.63131: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30575 1726867608.63133: variable 'ansible_facts' from source: unknown 30575 1726867608.63135: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867608.600946-32695-96532750044518/AnsiballZ_ping.py 30575 1726867608.63468: Sending initial data 30575 1726867608.63471: Sent initial data (151 bytes) 30575 1726867608.64040: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867608.64082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867608.64200: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867608.64206: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867608.64259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867608.64298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867608.65921: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 30575 1726867608.65925: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867608.66001: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867608.66008: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpoji88n1v /root/.ansible/tmp/ansible-tmp-1726867608.600946-32695-96532750044518/AnsiballZ_ping.py <<< 30575 1726867608.66013: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867608.600946-32695-96532750044518/AnsiballZ_ping.py" <<< 30575 1726867608.66076: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpoji88n1v" to remote "/root/.ansible/tmp/ansible-tmp-1726867608.600946-32695-96532750044518/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867608.600946-32695-96532750044518/AnsiballZ_ping.py" <<< 30575 1726867608.66988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867608.67154: stderr chunk (state=3): >>><<< 30575 1726867608.67197: stdout chunk (state=3): >>><<< 30575 1726867608.67332: done transferring module to remote 30575 1726867608.67420: _low_level_execute_command(): starting 30575 1726867608.67424: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867608.600946-32695-96532750044518/ /root/.ansible/tmp/ansible-tmp-1726867608.600946-32695-96532750044518/AnsiballZ_ping.py && sleep 0' 30575 1726867608.68199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867608.68246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867608.68256: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867608.68302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867608.70053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867608.70073: stderr chunk (state=3): >>><<< 30575 1726867608.70076: stdout chunk (state=3): >>><<< 30575 1726867608.70103: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867608.70106: _low_level_execute_command(): starting 30575 1726867608.70112: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867608.600946-32695-96532750044518/AnsiballZ_ping.py && sleep 0' 30575 1726867608.70606: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867608.70610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867608.70612: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867608.70614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867608.70663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867608.70671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867608.70713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 
1726867608.85805: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30575 1726867608.87384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867608.87387: stdout chunk (state=3): >>><<< 30575 1726867608.87392: stderr chunk (state=3): >>><<< 30575 1726867608.87395: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867608.87432: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867608.600946-32695-96532750044518/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867608.87435: _low_level_execute_command(): starting 30575 1726867608.87438: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867608.600946-32695-96532750044518/ > /dev/null 2>&1 && sleep 0' 30575 1726867608.87979: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867608.87984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867608.87995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867608.88008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867608.88022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867608.88088: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867608.88091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867608.88094: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867608.88096: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 
1726867608.88098: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867608.88100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867608.88102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867608.88105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867608.88106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867608.88108: stderr chunk (state=3): >>>debug2: match found <<< 30575 1726867608.88110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867608.88274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867608.88283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867608.88286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867608.88288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867608.90094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867608.90118: stderr chunk (state=3): >>><<< 30575 1726867608.90123: stdout chunk (state=3): >>><<< 30575 1726867608.90139: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867608.90144: handler run complete 30575 1726867608.90157: attempt loop complete, returning result 30575 1726867608.90159: _execute() done 30575 1726867608.90162: dumping result to json 30575 1726867608.90164: done dumping result, returning 30575 1726867608.90173: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-e081-a588-000000000d2b] 30575 1726867608.90180: sending task result for task 0affcac9-a3a5-e081-a588-000000000d2b 30575 1726867608.90267: done sending task result for task 0affcac9-a3a5-e081-a588-000000000d2b 30575 1726867608.90269: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30575 1726867608.90352: no more pending results, returning what we have 30575 1726867608.90355: results queue empty 30575 1726867608.90356: checking for any_errors_fatal 30575 1726867608.90361: done checking for any_errors_fatal 30575 1726867608.90362: checking for max_fail_percentage 30575 1726867608.90363: done checking for max_fail_percentage 30575 1726867608.90364: checking to see if all hosts have failed and the running result is not ok 30575 1726867608.90365: done checking to see if all hosts have failed 30575 1726867608.90366: getting the remaining hosts for this loop 30575 1726867608.90367: done getting 
the remaining hosts for this loop 30575 1726867608.90371: getting the next task for host managed_node3 30575 1726867608.90383: done getting next task for host managed_node3 30575 1726867608.90386: ^ task is: TASK: meta (role_complete) 30575 1726867608.90390: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867608.90402: getting variables 30575 1726867608.90403: in VariableManager get_vars() 30575 1726867608.90440: Calling all_inventory to load vars for managed_node3 30575 1726867608.90443: Calling groups_inventory to load vars for managed_node3 30575 1726867608.90445: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867608.90453: Calling all_plugins_play to load vars for managed_node3 30575 1726867608.90456: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867608.90458: Calling groups_plugins_play to load vars for managed_node3 30575 1726867608.95255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867608.96525: done with get_vars() 30575 1726867608.96543: done getting variables 30575 1726867608.96591: done queuing things up, now waiting for results queue to drain 30575 1726867608.96593: results queue empty 30575 1726867608.96593: checking for any_errors_fatal 30575 1726867608.96595: done checking for any_errors_fatal 30575 1726867608.96595: checking for max_fail_percentage 30575 1726867608.96596: done checking for max_fail_percentage 30575 1726867608.96597: checking to see if all hosts have failed and the running result is not ok 30575 1726867608.96597: done checking to see if all hosts have failed 30575 1726867608.96598: getting the remaining hosts for this loop 30575 1726867608.96598: done getting the remaining hosts for this loop 30575 1726867608.96600: getting the next task for host managed_node3 30575 1726867608.96603: done getting next task for host managed_node3 30575 1726867608.96604: ^ task is: TASK: Asserts 30575 1726867608.96606: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867608.96607: getting variables 30575 1726867608.96608: in VariableManager get_vars() 30575 1726867608.96614: Calling all_inventory to load vars for managed_node3 30575 1726867608.96616: Calling groups_inventory to load vars for managed_node3 30575 1726867608.96619: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867608.96623: Calling all_plugins_play to load vars for managed_node3 30575 1726867608.96624: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867608.96626: Calling groups_plugins_play to load vars for managed_node3 30575 1726867608.97240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867608.98468: done with get_vars() 30575 1726867608.98489: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 17:26:48 -0400 (0:00:00.428) 0:00:44.363 ****** 30575 1726867608.98562: entering _queue_task() for managed_node3/include_tasks 30575 1726867608.98923: worker is 1 (out of 1 available) 30575 1726867608.98934: exiting _queue_task() for managed_node3/include_tasks 30575 1726867608.98947: done queuing things up, now waiting for results queue to drain 30575 1726867608.98949: waiting for pending results... 
30575 1726867608.99323: running TaskExecutor() for managed_node3/TASK: Asserts 30575 1726867608.99390: in run() - task 0affcac9-a3a5-e081-a588-000000000a4e 30575 1726867608.99583: variable 'ansible_search_path' from source: unknown 30575 1726867608.99587: variable 'ansible_search_path' from source: unknown 30575 1726867608.99589: variable 'lsr_assert' from source: include params 30575 1726867608.99692: variable 'lsr_assert' from source: include params 30575 1726867608.99972: variable 'omit' from source: magic vars 30575 1726867609.00186: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867609.00190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867609.00200: variable 'omit' from source: magic vars 30575 1726867609.00496: variable 'ansible_distribution_major_version' from source: facts 30575 1726867609.00506: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867609.00524: variable 'item' from source: unknown 30575 1726867609.00625: variable 'item' from source: unknown 30575 1726867609.00650: variable 'item' from source: unknown 30575 1726867609.00679: variable 'item' from source: unknown 30575 1726867609.00846: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867609.00849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867609.00852: variable 'omit' from source: magic vars 30575 1726867609.00932: variable 'ansible_distribution_major_version' from source: facts 30575 1726867609.00935: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867609.00941: variable 'item' from source: unknown 30575 1726867609.00988: variable 'item' from source: unknown 30575 1726867609.01006: variable 'item' from source: unknown 30575 1726867609.01059: variable 'item' from source: unknown 30575 1726867609.01120: dumping result to json 30575 1726867609.01130: done dumping result, returning 30575 
1726867609.01132: done running TaskExecutor() for managed_node3/TASK: Asserts [0affcac9-a3a5-e081-a588-000000000a4e] 30575 1726867609.01135: sending task result for task 0affcac9-a3a5-e081-a588-000000000a4e 30575 1726867609.01166: done sending task result for task 0affcac9-a3a5-e081-a588-000000000a4e 30575 1726867609.01170: WORKER PROCESS EXITING 30575 1726867609.01198: no more pending results, returning what we have 30575 1726867609.01202: in VariableManager get_vars() 30575 1726867609.01244: Calling all_inventory to load vars for managed_node3 30575 1726867609.01246: Calling groups_inventory to load vars for managed_node3 30575 1726867609.01250: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867609.01263: Calling all_plugins_play to load vars for managed_node3 30575 1726867609.01266: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867609.01269: Calling groups_plugins_play to load vars for managed_node3 30575 1726867609.02348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867609.03606: done with get_vars() 30575 1726867609.03623: variable 'ansible_search_path' from source: unknown 30575 1726867609.03624: variable 'ansible_search_path' from source: unknown 30575 1726867609.03662: variable 'ansible_search_path' from source: unknown 30575 1726867609.03664: variable 'ansible_search_path' from source: unknown 30575 1726867609.03704: we have included files to process 30575 1726867609.03705: generating all_blocks data 30575 1726867609.03707: done generating all_blocks data 30575 1726867609.03712: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30575 1726867609.03713: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30575 1726867609.03715: Loading data from 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30575 1726867609.03823: in VariableManager get_vars() 30575 1726867609.03842: done with get_vars() 30575 1726867609.03960: done processing included file 30575 1726867609.03962: iterating over new_blocks loaded from include file 30575 1726867609.03963: in VariableManager get_vars() 30575 1726867609.03976: done with get_vars() 30575 1726867609.03980: filtering new block on tags 30575 1726867609.04023: done filtering new block on tags 30575 1726867609.04026: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 => (item=tasks/assert_device_present.yml) 30575 1726867609.04031: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30575 1726867609.04032: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30575 1726867609.04034: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30575 1726867609.04145: in VariableManager get_vars() 30575 1726867609.04163: done with get_vars() 30575 1726867609.04406: done processing included file 30575 1726867609.04408: iterating over new_blocks loaded from include file 30575 1726867609.04409: in VariableManager get_vars() 30575 1726867609.04422: done with get_vars() 30575 1726867609.04423: filtering new block on tags 30575 1726867609.04483: done filtering new block on tags 30575 1726867609.04486: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for 
managed_node3 => (item=tasks/assert_profile_present.yml) 30575 1726867609.04489: extending task lists for all hosts with included blocks 30575 1726867609.05445: done extending task lists 30575 1726867609.05446: done processing included files 30575 1726867609.05447: results queue empty 30575 1726867609.05448: checking for any_errors_fatal 30575 1726867609.05449: done checking for any_errors_fatal 30575 1726867609.05450: checking for max_fail_percentage 30575 1726867609.05451: done checking for max_fail_percentage 30575 1726867609.05452: checking to see if all hosts have failed and the running result is not ok 30575 1726867609.05453: done checking to see if all hosts have failed 30575 1726867609.05454: getting the remaining hosts for this loop 30575 1726867609.05455: done getting the remaining hosts for this loop 30575 1726867609.05457: getting the next task for host managed_node3 30575 1726867609.05462: done getting next task for host managed_node3 30575 1726867609.05464: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30575 1726867609.05467: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867609.05486: getting variables 30575 1726867609.05487: in VariableManager get_vars() 30575 1726867609.05503: Calling all_inventory to load vars for managed_node3 30575 1726867609.05506: Calling groups_inventory to load vars for managed_node3 30575 1726867609.05512: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867609.05520: Calling all_plugins_play to load vars for managed_node3 30575 1726867609.05522: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867609.05526: Calling groups_plugins_play to load vars for managed_node3 30575 1726867609.06761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867609.08500: done with get_vars() 30575 1726867609.08522: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 17:26:49 -0400 (0:00:00.100) 0:00:44.463 ****** 30575 1726867609.08610: entering _queue_task() for managed_node3/include_tasks 30575 1726867609.09016: worker is 1 (out of 1 available) 30575 1726867609.09028: exiting _queue_task() for managed_node3/include_tasks 30575 1726867609.09045: done queuing things up, now waiting for results queue to drain 30575 1726867609.09049: waiting for pending results... 
30575 1726867609.09499: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 30575 1726867609.09504: in run() - task 0affcac9-a3a5-e081-a588-000000000e86 30575 1726867609.09514: variable 'ansible_search_path' from source: unknown 30575 1726867609.09524: variable 'ansible_search_path' from source: unknown 30575 1726867609.09563: calling self._execute() 30575 1726867609.09656: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867609.09674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867609.09698: variable 'omit' from source: magic vars 30575 1726867609.10124: variable 'ansible_distribution_major_version' from source: facts 30575 1726867609.10146: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867609.10158: _execute() done 30575 1726867609.10249: dumping result to json 30575 1726867609.10257: done dumping result, returning 30575 1726867609.10261: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-e081-a588-000000000e86] 30575 1726867609.10263: sending task result for task 0affcac9-a3a5-e081-a588-000000000e86 30575 1726867609.10341: done sending task result for task 0affcac9-a3a5-e081-a588-000000000e86 30575 1726867609.10344: WORKER PROCESS EXITING 30575 1726867609.10381: no more pending results, returning what we have 30575 1726867609.10388: in VariableManager get_vars() 30575 1726867609.10428: Calling all_inventory to load vars for managed_node3 30575 1726867609.10431: Calling groups_inventory to load vars for managed_node3 30575 1726867609.10435: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867609.10449: Calling all_plugins_play to load vars for managed_node3 30575 1726867609.10452: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867609.10455: Calling groups_plugins_play to load vars for managed_node3 30575 
1726867609.12137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867609.13758: done with get_vars() 30575 1726867609.13780: variable 'ansible_search_path' from source: unknown 30575 1726867609.13782: variable 'ansible_search_path' from source: unknown 30575 1726867609.13790: variable 'item' from source: include params 30575 1726867609.13922: variable 'item' from source: include params 30575 1726867609.13958: we have included files to process 30575 1726867609.13960: generating all_blocks data 30575 1726867609.13962: done generating all_blocks data 30575 1726867609.13963: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867609.13966: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867609.13968: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867609.14166: done processing included file 30575 1726867609.14168: iterating over new_blocks loaded from include file 30575 1726867609.14170: in VariableManager get_vars() 30575 1726867609.14187: done with get_vars() 30575 1726867609.14188: filtering new block on tags 30575 1726867609.14218: done filtering new block on tags 30575 1726867609.14220: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 30575 1726867609.14227: extending task lists for all hosts with included blocks 30575 1726867609.14425: done extending task lists 30575 1726867609.14428: done processing included files 30575 1726867609.14429: results queue empty 30575 1726867609.14430: checking for any_errors_fatal 30575 1726867609.14433: done 
checking for any_errors_fatal 30575 1726867609.14434: checking for max_fail_percentage 30575 1726867609.14435: done checking for max_fail_percentage 30575 1726867609.14436: checking to see if all hosts have failed and the running result is not ok 30575 1726867609.14437: done checking to see if all hosts have failed 30575 1726867609.14437: getting the remaining hosts for this loop 30575 1726867609.14439: done getting the remaining hosts for this loop 30575 1726867609.14441: getting the next task for host managed_node3 30575 1726867609.14446: done getting next task for host managed_node3 30575 1726867609.14448: ^ task is: TASK: Get stat for interface {{ interface }} 30575 1726867609.14452: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867609.14454: getting variables 30575 1726867609.14455: in VariableManager get_vars() 30575 1726867609.14465: Calling all_inventory to load vars for managed_node3 30575 1726867609.14467: Calling groups_inventory to load vars for managed_node3 30575 1726867609.14469: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867609.14474: Calling all_plugins_play to load vars for managed_node3 30575 1726867609.14478: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867609.14481: Calling groups_plugins_play to load vars for managed_node3 30575 1726867609.15745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867609.17258: done with get_vars() 30575 1726867609.17285: done getting variables 30575 1726867609.17441: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:26:49 -0400 (0:00:00.088) 0:00:44.552 ****** 30575 1726867609.17469: entering _queue_task() for managed_node3/stat 30575 1726867609.17838: worker is 1 (out of 1 available) 30575 1726867609.17852: exiting _queue_task() for managed_node3/stat 30575 1726867609.17866: done queuing things up, now waiting for results queue to drain 30575 1726867609.17868: waiting for pending results... 
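The task queued above ("Get stat for interface statebr") boils down to a `stat` call against the interface's sysfs entry, as the module result later in the log confirms (`path: /sys/class/net/statebr`, `islnk: true`). A minimal standalone sketch of the same presence check, assuming a Linux host; `interface_present` is a name invented here for illustration, not part of the test role:

```python
import os

def interface_present(name: str) -> bool:
    """Mirror the test's check: on Linux a network interface exists iff
    /sys/class/net/<name> is present. The entry is a symlink into
    /sys/devices/..., which is why the stat result reports islnk=true."""
    # lexists() checks the link itself without following it.
    return os.path.lexists(os.path.join("/sys/class/net", name))

print(interface_present("no-such-interface-zz9"))
```

The `stat` module gathers far more than existence (mode, ownership, link target), but existence is the only field the follow-up assertion in this test consumes.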
30575 1726867609.18213: running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr 30575 1726867609.18583: in run() - task 0affcac9-a3a5-e081-a588-000000000ef5 30575 1726867609.18587: variable 'ansible_search_path' from source: unknown 30575 1726867609.18589: variable 'ansible_search_path' from source: unknown 30575 1726867609.18592: calling self._execute() 30575 1726867609.18594: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867609.18597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867609.18599: variable 'omit' from source: magic vars 30575 1726867609.18973: variable 'ansible_distribution_major_version' from source: facts 30575 1726867609.18993: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867609.19005: variable 'omit' from source: magic vars 30575 1726867609.19075: variable 'omit' from source: magic vars 30575 1726867609.19193: variable 'interface' from source: play vars 30575 1726867609.19218: variable 'omit' from source: magic vars 30575 1726867609.19267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867609.19306: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867609.19331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867609.19352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867609.19373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867609.19412: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867609.19425: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867609.19435: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867609.19539: Set connection var ansible_pipelining to False 30575 1726867609.19548: Set connection var ansible_shell_type to sh 30575 1726867609.19559: Set connection var ansible_shell_executable to /bin/sh 30575 1726867609.19582: Set connection var ansible_timeout to 10 30575 1726867609.19585: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867609.19592: Set connection var ansible_connection to ssh 30575 1726867609.19691: variable 'ansible_shell_executable' from source: unknown 30575 1726867609.19695: variable 'ansible_connection' from source: unknown 30575 1726867609.19697: variable 'ansible_module_compression' from source: unknown 30575 1726867609.19699: variable 'ansible_shell_type' from source: unknown 30575 1726867609.19701: variable 'ansible_shell_executable' from source: unknown 30575 1726867609.19703: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867609.19709: variable 'ansible_pipelining' from source: unknown 30575 1726867609.19712: variable 'ansible_timeout' from source: unknown 30575 1726867609.19714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867609.19887: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867609.19911: variable 'omit' from source: magic vars 30575 1726867609.19923: starting attempt loop 30575 1726867609.19930: running the handler 30575 1726867609.19954: _low_level_execute_command(): starting 30575 1726867609.19976: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867609.20800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867609.20902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867609.20924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867609.20941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867609.20964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867609.21058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867609.22772: stdout chunk (state=3): >>>/root <<< 30575 1726867609.22941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867609.22944: stdout chunk (state=3): >>><<< 30575 1726867609.22947: stderr chunk (state=3): >>><<< 30575 1726867609.22986: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867609.23075: _low_level_execute_command(): starting 30575 1726867609.23081: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867609.2299402-32727-204824962897413 `" && echo ansible-tmp-1726867609.2299402-32727-204824962897413="` echo /root/.ansible/tmp/ansible-tmp-1726867609.2299402-32727-204824962897413 `" ) && sleep 0' 30575 1726867609.23730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867609.23736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867609.23760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 30575 1726867609.23770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867609.23819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867609.23823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867609.23843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867609.23898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867609.25947: stdout chunk (state=3): >>>ansible-tmp-1726867609.2299402-32727-204824962897413=/root/.ansible/tmp/ansible-tmp-1726867609.2299402-32727-204824962897413 <<< 30575 1726867609.25951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867609.25953: stderr chunk (state=3): >>><<< 30575 1726867609.25955: stdout chunk (state=3): >>><<< 30575 1726867609.26038: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867609.2299402-32727-204824962897413=/root/.ansible/tmp/ansible-tmp-1726867609.2299402-32727-204824962897413 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867609.26042: variable 'ansible_module_compression' from source: unknown 30575 1726867609.26101: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30575 1726867609.26338: variable 'ansible_facts' from source: unknown 30575 1726867609.26435: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867609.2299402-32727-204824962897413/AnsiballZ_stat.py 30575 1726867609.26798: Sending initial data 30575 1726867609.26801: Sent initial data (153 bytes) 30575 1726867609.27384: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867609.27394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867609.27426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867609.27429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867609.27432: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867609.27434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867609.27483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867609.27497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867609.27542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867609.29309: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867609.29314: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867609.29316: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpvv4j83kn /root/.ansible/tmp/ansible-tmp-1726867609.2299402-32727-204824962897413/AnsiballZ_stat.py <<< 30575 1726867609.29319: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867609.2299402-32727-204824962897413/AnsiballZ_stat.py" <<< 30575 1726867609.29321: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpvv4j83kn" to remote "/root/.ansible/tmp/ansible-tmp-1726867609.2299402-32727-204824962897413/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867609.2299402-32727-204824962897413/AnsiballZ_stat.py" <<< 30575 1726867609.30889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867609.30983: stderr chunk (state=3): >>><<< 30575 1726867609.30986: stdout chunk (state=3): >>><<< 30575 1726867609.30989: done transferring module to remote 30575 1726867609.30991: _low_level_execute_command(): starting 30575 1726867609.30993: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867609.2299402-32727-204824962897413/ /root/.ansible/tmp/ansible-tmp-1726867609.2299402-32727-204824962897413/AnsiballZ_stat.py && sleep 0' 30575 1726867609.32909: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867609.32924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867609.32943: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867609.33216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867609.33230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867609.33515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867609.33669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867609.35637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867609.35641: stdout chunk (state=3): >>><<< 30575 1726867609.35643: stderr chunk (state=3): >>><<< 30575 1726867609.35661: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867609.35671: _low_level_execute_command(): starting 30575 1726867609.35683: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867609.2299402-32727-204824962897413/AnsiballZ_stat.py && sleep 0' 30575 1726867609.36693: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867609.36701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867609.36704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867609.36764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867609.36841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867609.36844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 
1726867609.36982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867609.36997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867609.52259: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31319, "dev": 23, "nlink": 1, "atime": 1726867600.8171837, "mtime": 1726867600.8171837, "ctime": 1726867600.8171837, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30575 1726867609.53608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867609.53612: stdout chunk (state=3): >>><<< 30575 1726867609.53614: stderr chunk (state=3): >>><<< 30575 1726867609.53640: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31319, "dev": 23, "nlink": 1, "atime": 1726867600.8171837, "mtime": 1726867600.8171837, "ctime": 1726867600.8171837, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867609.53709: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867609.2299402-32727-204824962897413/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867609.53713: _low_level_execute_command(): starting 30575 1726867609.53719: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867609.2299402-32727-204824962897413/ > /dev/null 2>&1 && sleep 0' 30575 1726867609.54240: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867609.54245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867609.54313: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867609.54316: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867609.54329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867609.54373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867609.56219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867609.56230: stderr chunk (state=3): >>><<< 30575 1726867609.56233: stdout chunk (state=3): >>><<< 30575 1726867609.56247: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867609.56252: handler run complete 30575 1726867609.56283: attempt loop complete, returning result 30575 1726867609.56286: _execute() done 30575 1726867609.56288: dumping result to json 30575 1726867609.56294: done dumping result, returning 30575 1726867609.56301: done running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr [0affcac9-a3a5-e081-a588-000000000ef5] 30575 1726867609.56305: sending task result for task 0affcac9-a3a5-e081-a588-000000000ef5 30575 1726867609.56411: done sending task result for task 0affcac9-a3a5-e081-a588-000000000ef5 30575 1726867609.56414: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726867600.8171837, "block_size": 4096, "blocks": 0, "ctime": 1726867600.8171837, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 31319, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726867600.8171837, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 30575 1726867609.56509: no more pending results, returning what we have 30575 1726867609.56513: results queue empty 30575 1726867609.56513: checking for any_errors_fatal 30575 
1726867609.56514: done checking for any_errors_fatal 30575 1726867609.56515: checking for max_fail_percentage 30575 1726867609.56519: done checking for max_fail_percentage 30575 1726867609.56520: checking to see if all hosts have failed and the running result is not ok 30575 1726867609.56521: done checking to see if all hosts have failed 30575 1726867609.56522: getting the remaining hosts for this loop 30575 1726867609.56523: done getting the remaining hosts for this loop 30575 1726867609.56527: getting the next task for host managed_node3 30575 1726867609.56537: done getting next task for host managed_node3 30575 1726867609.56540: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 30575 1726867609.56543: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867609.56549: getting variables 30575 1726867609.56551: in VariableManager get_vars() 30575 1726867609.56613: Calling all_inventory to load vars for managed_node3 30575 1726867609.56615: Calling groups_inventory to load vars for managed_node3 30575 1726867609.56621: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867609.56631: Calling all_plugins_play to load vars for managed_node3 30575 1726867609.56633: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867609.56636: Calling groups_plugins_play to load vars for managed_node3 30575 1726867609.57446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867609.58919: done with get_vars() 30575 1726867609.58949: done getting variables 30575 1726867609.59028: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867609.59118: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 17:26:49 -0400 (0:00:00.416) 0:00:44.969 ****** 30575 1726867609.59157: entering _queue_task() for managed_node3/assert 30575 1726867609.59463: worker is 1 (out of 1 available) 30575 1726867609.59479: exiting _queue_task() for managed_node3/assert 30575 1726867609.59492: done queuing things up, now waiting for results queue to drain 30575 1726867609.59493: waiting for pending results... 
30575 1726867609.59788: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'statebr' 30575 1726867609.59871: in run() - task 0affcac9-a3a5-e081-a588-000000000e87 30575 1726867609.59941: variable 'ansible_search_path' from source: unknown 30575 1726867609.59945: variable 'ansible_search_path' from source: unknown 30575 1726867609.59948: calling self._execute() 30575 1726867609.60066: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867609.60072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867609.60075: variable 'omit' from source: magic vars 30575 1726867609.60405: variable 'ansible_distribution_major_version' from source: facts 30575 1726867609.60419: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867609.60426: variable 'omit' from source: magic vars 30575 1726867609.60470: variable 'omit' from source: magic vars 30575 1726867609.60541: variable 'interface' from source: play vars 30575 1726867609.60558: variable 'omit' from source: magic vars 30575 1726867609.60589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867609.60615: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867609.60637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867609.60649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867609.60659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867609.60687: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867609.60690: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867609.60693: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867609.60764: Set connection var ansible_pipelining to False 30575 1726867609.60767: Set connection var ansible_shell_type to sh 30575 1726867609.60778: Set connection var ansible_shell_executable to /bin/sh 30575 1726867609.60781: Set connection var ansible_timeout to 10 30575 1726867609.60783: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867609.60792: Set connection var ansible_connection to ssh 30575 1726867609.60809: variable 'ansible_shell_executable' from source: unknown 30575 1726867609.60813: variable 'ansible_connection' from source: unknown 30575 1726867609.60815: variable 'ansible_module_compression' from source: unknown 30575 1726867609.60817: variable 'ansible_shell_type' from source: unknown 30575 1726867609.60821: variable 'ansible_shell_executable' from source: unknown 30575 1726867609.60824: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867609.60830: variable 'ansible_pipelining' from source: unknown 30575 1726867609.60832: variable 'ansible_timeout' from source: unknown 30575 1726867609.60834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867609.60938: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867609.60947: variable 'omit' from source: magic vars 30575 1726867609.60957: starting attempt loop 30575 1726867609.60960: running the handler 30575 1726867609.61081: variable 'interface_stat' from source: set_fact 30575 1726867609.61096: Evaluated conditional (interface_stat.stat.exists): True 30575 1726867609.61102: handler run complete 30575 1726867609.61113: attempt loop complete, returning result 30575 
1726867609.61116: _execute() done 30575 1726867609.61118: dumping result to json 30575 1726867609.61124: done dumping result, returning 30575 1726867609.61131: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'statebr' [0affcac9-a3a5-e081-a588-000000000e87] 30575 1726867609.61136: sending task result for task 0affcac9-a3a5-e081-a588-000000000e87 30575 1726867609.61214: done sending task result for task 0affcac9-a3a5-e081-a588-000000000e87 30575 1726867609.61217: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867609.61268: no more pending results, returning what we have 30575 1726867609.61272: results queue empty 30575 1726867609.61273: checking for any_errors_fatal 30575 1726867609.61284: done checking for any_errors_fatal 30575 1726867609.61285: checking for max_fail_percentage 30575 1726867609.61286: done checking for max_fail_percentage 30575 1726867609.61287: checking to see if all hosts have failed and the running result is not ok 30575 1726867609.61288: done checking to see if all hosts have failed 30575 1726867609.61289: getting the remaining hosts for this loop 30575 1726867609.61290: done getting the remaining hosts for this loop 30575 1726867609.61294: getting the next task for host managed_node3 30575 1726867609.61305: done getting next task for host managed_node3 30575 1726867609.61307: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30575 1726867609.61310: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867609.61314: getting variables 30575 1726867609.61315: in VariableManager get_vars() 30575 1726867609.61356: Calling all_inventory to load vars for managed_node3 30575 1726867609.61359: Calling groups_inventory to load vars for managed_node3 30575 1726867609.61362: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867609.61372: Calling all_plugins_play to load vars for managed_node3 30575 1726867609.61374: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867609.61378: Calling groups_plugins_play to load vars for managed_node3 30575 1726867609.62473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867609.63481: done with get_vars() 30575 1726867609.63499: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 17:26:49 -0400 (0:00:00.044) 0:00:45.013 ****** 30575 1726867609.63585: entering _queue_task() for managed_node3/include_tasks 30575 1726867609.63800: worker is 1 (out of 1 available) 30575 1726867609.63813: exiting _queue_task() for managed_node3/include_tasks 30575 1726867609.63826: done queuing things up, now waiting for results queue to drain 30575 1726867609.63828: waiting for pending results... 
30575 1726867609.64018: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 30575 1726867609.64136: in run() - task 0affcac9-a3a5-e081-a588-000000000e8b 30575 1726867609.64140: variable 'ansible_search_path' from source: unknown 30575 1726867609.64143: variable 'ansible_search_path' from source: unknown 30575 1726867609.64173: calling self._execute() 30575 1726867609.64263: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867609.64269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867609.64314: variable 'omit' from source: magic vars 30575 1726867609.64665: variable 'ansible_distribution_major_version' from source: facts 30575 1726867609.64668: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867609.64672: _execute() done 30575 1726867609.64675: dumping result to json 30575 1726867609.64680: done dumping result, returning 30575 1726867609.64683: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-e081-a588-000000000e8b] 30575 1726867609.64685: sending task result for task 0affcac9-a3a5-e081-a588-000000000e8b 30575 1726867609.64854: no more pending results, returning what we have 30575 1726867609.64858: in VariableManager get_vars() 30575 1726867609.64895: Calling all_inventory to load vars for managed_node3 30575 1726867609.64897: Calling groups_inventory to load vars for managed_node3 30575 1726867609.64900: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867609.64911: Calling all_plugins_play to load vars for managed_node3 30575 1726867609.64914: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867609.64916: Calling groups_plugins_play to load vars for managed_node3 30575 1726867609.65468: done sending task result for task 0affcac9-a3a5-e081-a588-000000000e8b 30575 1726867609.65472: WORKER PROCESS EXITING 30575 
1726867609.65946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867609.66946: done with get_vars() 30575 1726867609.66959: variable 'ansible_search_path' from source: unknown 30575 1726867609.66960: variable 'ansible_search_path' from source: unknown 30575 1726867609.66967: variable 'item' from source: include params 30575 1726867609.67039: variable 'item' from source: include params 30575 1726867609.67061: we have included files to process 30575 1726867609.67061: generating all_blocks data 30575 1726867609.67063: done generating all_blocks data 30575 1726867609.67065: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867609.67065: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867609.67067: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867609.68330: done processing included file 30575 1726867609.68335: iterating over new_blocks loaded from include file 30575 1726867609.68337: in VariableManager get_vars() 30575 1726867609.68356: done with get_vars() 30575 1726867609.68359: filtering new block on tags 30575 1726867609.68464: done filtering new block on tags 30575 1726867609.68468: in VariableManager get_vars() 30575 1726867609.68490: done with get_vars() 30575 1726867609.68492: filtering new block on tags 30575 1726867609.68550: done filtering new block on tags 30575 1726867609.68553: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 30575 1726867609.68559: extending task lists for all hosts with included blocks 30575 1726867609.68968: done 
extending task lists 30575 1726867609.68970: done processing included files 30575 1726867609.68971: results queue empty 30575 1726867609.68971: checking for any_errors_fatal 30575 1726867609.68975: done checking for any_errors_fatal 30575 1726867609.68975: checking for max_fail_percentage 30575 1726867609.68976: done checking for max_fail_percentage 30575 1726867609.68979: checking to see if all hosts have failed and the running result is not ok 30575 1726867609.68980: done checking to see if all hosts have failed 30575 1726867609.68981: getting the remaining hosts for this loop 30575 1726867609.68982: done getting the remaining hosts for this loop 30575 1726867609.68985: getting the next task for host managed_node3 30575 1726867609.68991: done getting next task for host managed_node3 30575 1726867609.68993: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30575 1726867609.68996: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30575 1726867609.68998: getting variables 30575 1726867609.68999: in VariableManager get_vars() 30575 1726867609.69016: Calling all_inventory to load vars for managed_node3 30575 1726867609.69019: Calling groups_inventory to load vars for managed_node3 30575 1726867609.69021: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867609.69030: Calling all_plugins_play to load vars for managed_node3 30575 1726867609.69033: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867609.69036: Calling groups_plugins_play to load vars for managed_node3 30575 1726867609.70230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867609.72569: done with get_vars() 30575 1726867609.72593: done getting variables 30575 1726867609.72652: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:26:49 -0400 (0:00:00.090) 0:00:45.104 ****** 30575 1726867609.72687: entering _queue_task() for managed_node3/set_fact 30575 1726867609.73115: worker is 1 (out of 1 available) 30575 1726867609.73135: exiting _queue_task() for managed_node3/set_fact 30575 1726867609.73150: done queuing things up, now waiting for results queue to drain 30575 1726867609.73161: waiting for pending results... 
30575 1726867609.73411: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 30575 1726867609.73665: in run() - task 0affcac9-a3a5-e081-a588-000000000f13 30575 1726867609.73683: variable 'ansible_search_path' from source: unknown 30575 1726867609.73709: variable 'ansible_search_path' from source: unknown 30575 1726867609.73734: calling self._execute() 30575 1726867609.73823: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867609.73995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867609.74005: variable 'omit' from source: magic vars 30575 1726867609.74447: variable 'ansible_distribution_major_version' from source: facts 30575 1726867609.74483: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867609.74494: variable 'omit' from source: magic vars 30575 1726867609.74554: variable 'omit' from source: magic vars 30575 1726867609.74670: variable 'omit' from source: magic vars 30575 1726867609.74675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867609.74700: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867609.74725: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867609.74743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867609.74754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867609.74791: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867609.74799: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867609.74804: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867609.74925: Set connection var ansible_pipelining to False 30575 1726867609.74929: Set connection var ansible_shell_type to sh 30575 1726867609.74931: Set connection var ansible_shell_executable to /bin/sh 30575 1726867609.74933: Set connection var ansible_timeout to 10 30575 1726867609.74935: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867609.74937: Set connection var ansible_connection to ssh 30575 1726867609.75021: variable 'ansible_shell_executable' from source: unknown 30575 1726867609.75024: variable 'ansible_connection' from source: unknown 30575 1726867609.75028: variable 'ansible_module_compression' from source: unknown 30575 1726867609.75031: variable 'ansible_shell_type' from source: unknown 30575 1726867609.75033: variable 'ansible_shell_executable' from source: unknown 30575 1726867609.75035: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867609.75037: variable 'ansible_pipelining' from source: unknown 30575 1726867609.75040: variable 'ansible_timeout' from source: unknown 30575 1726867609.75042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867609.75162: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867609.75166: variable 'omit' from source: magic vars 30575 1726867609.75168: starting attempt loop 30575 1726867609.75170: running the handler 30575 1726867609.75172: handler run complete 30575 1726867609.75264: attempt loop complete, returning result 30575 1726867609.75267: _execute() done 30575 1726867609.75270: dumping result to json 30575 1726867609.75272: done dumping result, returning 30575 1726867609.75274: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-e081-a588-000000000f13] 30575 1726867609.75276: sending task result for task 0affcac9-a3a5-e081-a588-000000000f13 30575 1726867609.75355: done sending task result for task 0affcac9-a3a5-e081-a588-000000000f13 30575 1726867609.75358: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30575 1726867609.75614: no more pending results, returning what we have 30575 1726867609.75617: results queue empty 30575 1726867609.75618: checking for any_errors_fatal 30575 1726867609.75619: done checking for any_errors_fatal 30575 1726867609.75620: checking for max_fail_percentage 30575 1726867609.75622: done checking for max_fail_percentage 30575 1726867609.75623: checking to see if all hosts have failed and the running result is not ok 30575 1726867609.75624: done checking to see if all hosts have failed 30575 1726867609.75625: getting the remaining hosts for this loop 30575 1726867609.75626: done getting the remaining hosts for this loop 30575 1726867609.75629: getting the next task for host managed_node3 30575 1726867609.75637: done getting next task for host managed_node3 30575 1726867609.75640: ^ task is: TASK: Stat profile file 30575 1726867609.75645: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867609.75649: getting variables 30575 1726867609.75651: in VariableManager get_vars() 30575 1726867609.75683: Calling all_inventory to load vars for managed_node3 30575 1726867609.75686: Calling groups_inventory to load vars for managed_node3 30575 1726867609.75689: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867609.75699: Calling all_plugins_play to load vars for managed_node3 30575 1726867609.75702: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867609.75705: Calling groups_plugins_play to load vars for managed_node3 30575 1726867609.77347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867609.79219: done with get_vars() 30575 1726867609.79240: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:26:49 -0400 (0:00:00.066) 0:00:45.172 ****** 30575 1726867609.79499: entering _queue_task() for managed_node3/stat 30575 1726867609.80219: worker is 1 (out of 1 available) 30575 1726867609.80232: exiting _queue_task() for managed_node3/stat 30575 1726867609.80244: done queuing things up, now waiting for results queue to drain 30575 1726867609.80246: 
waiting for pending results... 30575 1726867609.80602: running TaskExecutor() for managed_node3/TASK: Stat profile file 30575 1726867609.80707: in run() - task 0affcac9-a3a5-e081-a588-000000000f14 30575 1726867609.80735: variable 'ansible_search_path' from source: unknown 30575 1726867609.80742: variable 'ansible_search_path' from source: unknown 30575 1726867609.80783: calling self._execute() 30575 1726867609.80925: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867609.80931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867609.80934: variable 'omit' from source: magic vars 30575 1726867609.81299: variable 'ansible_distribution_major_version' from source: facts 30575 1726867609.81309: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867609.81315: variable 'omit' from source: magic vars 30575 1726867609.81358: variable 'omit' from source: magic vars 30575 1726867609.81587: variable 'profile' from source: play vars 30575 1726867609.81591: variable 'interface' from source: play vars 30575 1726867609.81593: variable 'interface' from source: play vars 30575 1726867609.81595: variable 'omit' from source: magic vars 30575 1726867609.81624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867609.81659: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867609.81696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867609.81728: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867609.81746: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867609.81783: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30575 1726867609.82084: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867609.82087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867609.82092: Set connection var ansible_pipelining to False 30575 1726867609.82101: Set connection var ansible_shell_type to sh 30575 1726867609.82111: Set connection var ansible_shell_executable to /bin/sh 30575 1726867609.82120: Set connection var ansible_timeout to 10 30575 1726867609.82129: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867609.82139: Set connection var ansible_connection to ssh 30575 1726867609.82166: variable 'ansible_shell_executable' from source: unknown 30575 1726867609.82173: variable 'ansible_connection' from source: unknown 30575 1726867609.82192: variable 'ansible_module_compression' from source: unknown 30575 1726867609.82199: variable 'ansible_shell_type' from source: unknown 30575 1726867609.82206: variable 'ansible_shell_executable' from source: unknown 30575 1726867609.82212: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867609.82220: variable 'ansible_pipelining' from source: unknown 30575 1726867609.82227: variable 'ansible_timeout' from source: unknown 30575 1726867609.82235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867609.82537: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867609.82551: variable 'omit' from source: magic vars 30575 1726867609.82562: starting attempt loop 30575 1726867609.82569: running the handler 30575 1726867609.82602: _low_level_execute_command(): starting 30575 1726867609.82643: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 
1726867609.84140: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867609.84146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867609.84152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867609.84316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867609.84323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867609.84327: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867609.84490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867609.84494: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867609.84556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867609.84782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867609.86789: stdout chunk (state=3): >>>/root <<< 30575 1726867609.86793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867609.86796: stdout chunk (state=3): >>><<< 30575 1726867609.86798: stderr chunk (state=3): >>><<< 30575 1726867609.86805: _low_level_execute_command() done: 
rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867609.86808: _low_level_execute_command(): starting 30575 1726867609.86814: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867609.866976-32773-250978233977626 `" && echo ansible-tmp-1726867609.866976-32773-250978233977626="` echo /root/.ansible/tmp/ansible-tmp-1726867609.866976-32773-250978233977626 `" ) && sleep 0' 30575 1726867609.87385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867609.87404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867609.87426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867609.87455: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867609.87548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867609.87608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867609.87627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867609.87694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867609.87813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867609.89695: stdout chunk (state=3): >>>ansible-tmp-1726867609.866976-32773-250978233977626=/root/.ansible/tmp/ansible-tmp-1726867609.866976-32773-250978233977626 <<< 30575 1726867609.89869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867609.89872: stdout chunk (state=3): >>><<< 30575 1726867609.89874: stderr chunk (state=3): >>><<< 30575 1726867609.90183: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867609.866976-32773-250978233977626=/root/.ansible/tmp/ansible-tmp-1726867609.866976-32773-250978233977626 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867609.90193: variable 'ansible_module_compression' from source: unknown 30575 1726867609.90195: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30575 1726867609.90197: variable 'ansible_facts' from source: unknown 30575 1726867609.90410: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867609.866976-32773-250978233977626/AnsiballZ_stat.py 30575 1726867609.91255: Sending initial data 30575 1726867609.91258: Sent initial data (152 bytes) 30575 1726867609.92881: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867609.92896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867609.92910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867609.92958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867609.93096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867609.93111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867609.93171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867609.94716: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867609.94903: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867609.94957: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpo8nckt67 /root/.ansible/tmp/ansible-tmp-1726867609.866976-32773-250978233977626/AnsiballZ_stat.py <<< 30575 1726867609.94960: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867609.866976-32773-250978233977626/AnsiballZ_stat.py" <<< 30575 1726867609.94999: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpo8nckt67" to remote "/root/.ansible/tmp/ansible-tmp-1726867609.866976-32773-250978233977626/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867609.866976-32773-250978233977626/AnsiballZ_stat.py" <<< 30575 1726867609.96728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867609.96791: stderr chunk (state=3): >>><<< 30575 1726867609.96809: stdout chunk (state=3): >>><<< 30575 1726867609.96843: done transferring module to remote 30575 1726867609.96864: _low_level_execute_command(): starting 30575 1726867609.97070: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867609.866976-32773-250978233977626/ /root/.ansible/tmp/ansible-tmp-1726867609.866976-32773-250978233977626/AnsiballZ_stat.py && sleep 0' 30575 1726867609.98192: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867609.98392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867609.98493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867609.98569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867610.00320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867610.00370: stderr chunk (state=3): >>><<< 30575 1726867610.00497: stdout chunk (state=3): >>><<< 30575 1726867610.00522: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867610.00531: _low_level_execute_command(): starting 30575 1726867610.00546: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867609.866976-32773-250978233977626/AnsiballZ_stat.py && sleep 0' 30575 1726867610.01329: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867610.01341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867610.01353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867610.01368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867610.01385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867610.01403: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867610.01421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867610.01441: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867610.01454: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867610.01465: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867610.01522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 
1726867610.01690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867610.01769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867610.02245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867610.17461: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30575 1726867610.18756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867610.18767: stderr chunk (state=3): >>><<< 30575 1726867610.18792: stdout chunk (state=3): >>><<< 30575 1726867610.18821: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867610.18905: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867609.866976-32773-250978233977626/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867610.18908: _low_level_execute_command(): starting 30575 1726867610.18911: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867609.866976-32773-250978233977626/ > /dev/null 2>&1 && sleep 0' 30575 1726867610.19586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867610.19590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867610.19592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867610.19596: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867610.19599: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867610.19602: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867610.19604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867610.19606: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867610.19608: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867610.19610: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867610.19616: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867610.19711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867610.19718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867610.19721: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867610.19723: stderr chunk (state=3): >>>debug2: match found <<< 30575 1726867610.19725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867610.19771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867610.19774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867610.19817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867610.20098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867610.21698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867610.21737: stderr chunk (state=3): >>><<< 30575 
1726867610.21740: stdout chunk (state=3): >>><<< 30575 1726867610.21773: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867610.21778: handler run complete 30575 1726867610.21806: attempt loop complete, returning result 30575 1726867610.21809: _execute() done 30575 1726867610.21840: dumping result to json 30575 1726867610.21845: done dumping result, returning 30575 1726867610.21847: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcac9-a3a5-e081-a588-000000000f14] 30575 1726867610.21849: sending task result for task 0affcac9-a3a5-e081-a588-000000000f14 30575 1726867610.21947: done sending task result for task 0affcac9-a3a5-e081-a588-000000000f14 30575 1726867610.21950: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30575 
1726867610.22042: no more pending results, returning what we have 30575 1726867610.22046: results queue empty 30575 1726867610.22046: checking for any_errors_fatal 30575 1726867610.22057: done checking for any_errors_fatal 30575 1726867610.22057: checking for max_fail_percentage 30575 1726867610.22059: done checking for max_fail_percentage 30575 1726867610.22060: checking to see if all hosts have failed and the running result is not ok 30575 1726867610.22060: done checking to see if all hosts have failed 30575 1726867610.22062: getting the remaining hosts for this loop 30575 1726867610.22063: done getting the remaining hosts for this loop 30575 1726867610.22067: getting the next task for host managed_node3 30575 1726867610.22076: done getting next task for host managed_node3 30575 1726867610.22129: ^ task is: TASK: Set NM profile exist flag based on the profile files 30575 1726867610.22135: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 30575 1726867610.22139: getting variables 30575 1726867610.22140: in VariableManager get_vars() 30575 1726867610.22197: Calling all_inventory to load vars for managed_node3 30575 1726867610.22200: Calling groups_inventory to load vars for managed_node3 30575 1726867610.22208: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867610.22221: Calling all_plugins_play to load vars for managed_node3 30575 1726867610.22223: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867610.22226: Calling groups_plugins_play to load vars for managed_node3 30575 1726867610.23646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867610.24938: done with get_vars() 30575 1726867610.24955: done getting variables 30575 1726867610.25004: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:26:50 -0400 (0:00:00.455) 0:00:45.627 ****** 30575 1726867610.25030: entering _queue_task() for managed_node3/set_fact 30575 1726867610.25276: worker is 1 (out of 1 available) 30575 1726867610.25290: exiting _queue_task() for managed_node3/set_fact 30575 1726867610.25304: done queuing things up, now waiting for results queue to drain 30575 1726867610.25306: waiting for pending results... 
30575 1726867610.25551: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 30575 1726867610.25582: in run() - task 0affcac9-a3a5-e081-a588-000000000f15 30575 1726867610.25594: variable 'ansible_search_path' from source: unknown 30575 1726867610.25598: variable 'ansible_search_path' from source: unknown 30575 1726867610.25640: calling self._execute() 30575 1726867610.25709: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867610.25713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867610.25726: variable 'omit' from source: magic vars 30575 1726867610.26344: variable 'ansible_distribution_major_version' from source: facts 30575 1726867610.26415: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867610.26802: variable 'profile_stat' from source: set_fact 30575 1726867610.26814: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867610.26818: when evaluation is False, skipping this task 30575 1726867610.26827: _execute() done 30575 1726867610.26830: dumping result to json 30575 1726867610.26832: done dumping result, returning 30575 1726867610.26958: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-e081-a588-000000000f15] 30575 1726867610.26961: sending task result for task 0affcac9-a3a5-e081-a588-000000000f15 30575 1726867610.27046: done sending task result for task 0affcac9-a3a5-e081-a588-000000000f15 30575 1726867610.27049: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867610.27108: no more pending results, returning what we have 30575 1726867610.27113: results queue empty 30575 1726867610.27114: checking for any_errors_fatal 30575 1726867610.27287: done checking for any_errors_fatal 30575 1726867610.27290: 
checking for max_fail_percentage 30575 1726867610.27292: done checking for max_fail_percentage 30575 1726867610.27293: checking to see if all hosts have failed and the running result is not ok 30575 1726867610.27294: done checking to see if all hosts have failed 30575 1726867610.27295: getting the remaining hosts for this loop 30575 1726867610.27296: done getting the remaining hosts for this loop 30575 1726867610.27301: getting the next task for host managed_node3 30575 1726867610.27309: done getting next task for host managed_node3 30575 1726867610.27311: ^ task is: TASK: Get NM profile info 30575 1726867610.27320: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867610.27324: getting variables 30575 1726867610.27325: in VariableManager get_vars() 30575 1726867610.27369: Calling all_inventory to load vars for managed_node3 30575 1726867610.27372: Calling groups_inventory to load vars for managed_node3 30575 1726867610.27376: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867610.27442: Calling all_plugins_play to load vars for managed_node3 30575 1726867610.27445: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867610.27448: Calling groups_plugins_play to load vars for managed_node3 30575 1726867610.28758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867610.29651: done with get_vars() 30575 1726867610.29667: done getting variables 30575 1726867610.29712: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:26:50 -0400 (0:00:00.047) 0:00:45.674 ****** 30575 1726867610.29737: entering _queue_task() for managed_node3/shell 30575 1726867610.30012: worker is 1 (out of 1 available) 30575 1726867610.30027: exiting _queue_task() for managed_node3/shell 30575 1726867610.30041: done queuing things up, now waiting for results queue to drain 30575 1726867610.30042: waiting for pending results... 
30575 1726867610.30292: running TaskExecutor() for managed_node3/TASK: Get NM profile info 30575 1726867610.30374: in run() - task 0affcac9-a3a5-e081-a588-000000000f16 30575 1726867610.30392: variable 'ansible_search_path' from source: unknown 30575 1726867610.30396: variable 'ansible_search_path' from source: unknown 30575 1726867610.30433: calling self._execute() 30575 1726867610.30499: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867610.30503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867610.30511: variable 'omit' from source: magic vars 30575 1726867610.30826: variable 'ansible_distribution_major_version' from source: facts 30575 1726867610.30836: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867610.30847: variable 'omit' from source: magic vars 30575 1726867610.30881: variable 'omit' from source: magic vars 30575 1726867610.30955: variable 'profile' from source: play vars 30575 1726867610.30959: variable 'interface' from source: play vars 30575 1726867610.31006: variable 'interface' from source: play vars 30575 1726867610.31023: variable 'omit' from source: magic vars 30575 1726867610.31064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867610.31094: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867610.31110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867610.31125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867610.31136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867610.31160: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 
1726867610.31162: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867610.31165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867610.31238: Set connection var ansible_pipelining to False 30575 1726867610.31241: Set connection var ansible_shell_type to sh 30575 1726867610.31244: Set connection var ansible_shell_executable to /bin/sh 30575 1726867610.31250: Set connection var ansible_timeout to 10 30575 1726867610.31254: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867610.31261: Set connection var ansible_connection to ssh 30575 1726867610.31283: variable 'ansible_shell_executable' from source: unknown 30575 1726867610.31287: variable 'ansible_connection' from source: unknown 30575 1726867610.31290: variable 'ansible_module_compression' from source: unknown 30575 1726867610.31292: variable 'ansible_shell_type' from source: unknown 30575 1726867610.31295: variable 'ansible_shell_executable' from source: unknown 30575 1726867610.31297: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867610.31299: variable 'ansible_pipelining' from source: unknown 30575 1726867610.31301: variable 'ansible_timeout' from source: unknown 30575 1726867610.31303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867610.31411: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867610.31424: variable 'omit' from source: magic vars 30575 1726867610.31429: starting attempt loop 30575 1726867610.31432: running the handler 30575 1726867610.31441: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867610.31457: _low_level_execute_command(): starting 30575 1726867610.31464: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867610.32160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867610.32164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867610.32166: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867610.32169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867610.32172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867610.32231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867610.32234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867610.32241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867610.32283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867610.33909: stdout chunk 
(state=3): >>>/root <<< 30575 1726867610.34007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867610.34063: stderr chunk (state=3): >>><<< 30575 1726867610.34067: stdout chunk (state=3): >>><<< 30575 1726867610.34090: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867610.34103: _low_level_execute_command(): starting 30575 1726867610.34113: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867610.3409207-32810-103691477672899 `" && echo ansible-tmp-1726867610.3409207-32810-103691477672899="` echo /root/.ansible/tmp/ansible-tmp-1726867610.3409207-32810-103691477672899 `" ) && sleep 0' 30575 1726867610.34545: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867610.34549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867610.34551: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867610.34553: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867610.34555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867610.34557: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867610.34605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867610.34608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867610.34656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867610.36574: stdout chunk (state=3): >>>ansible-tmp-1726867610.3409207-32810-103691477672899=/root/.ansible/tmp/ansible-tmp-1726867610.3409207-32810-103691477672899 <<< 30575 1726867610.36719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867610.36722: stderr chunk (state=3): >>><<< 30575 1726867610.36738: stdout chunk (state=3): >>><<< 30575 1726867610.36763: _low_level_execute_command() done: 
rc=0, stdout=ansible-tmp-1726867610.3409207-32810-103691477672899=/root/.ansible/tmp/ansible-tmp-1726867610.3409207-32810-103691477672899 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867610.36793: variable 'ansible_module_compression' from source: unknown 30575 1726867610.36831: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867610.36903: variable 'ansible_facts' from source: unknown 30575 1726867610.37018: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867610.3409207-32810-103691477672899/AnsiballZ_command.py 30575 1726867610.37139: Sending initial data 30575 1726867610.37142: Sent initial data (156 bytes) 30575 1726867610.37695: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867610.37744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867610.37760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867610.37800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867610.37957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867610.39449: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867610.39460: stderr chunk 
(state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867610.39494: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867610.39542: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp4crq351n /root/.ansible/tmp/ansible-tmp-1726867610.3409207-32810-103691477672899/AnsiballZ_command.py <<< 30575 1726867610.39545: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867610.3409207-32810-103691477672899/AnsiballZ_command.py" <<< 30575 1726867610.39585: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp4crq351n" to remote "/root/.ansible/tmp/ansible-tmp-1726867610.3409207-32810-103691477672899/AnsiballZ_command.py" <<< 30575 1726867610.39589: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867610.3409207-32810-103691477672899/AnsiballZ_command.py" <<< 30575 1726867610.40152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867610.40156: stderr chunk (state=3): >>><<< 30575 1726867610.40161: stdout chunk (state=3): >>><<< 30575 1726867610.40204: done transferring module to remote 30575 1726867610.40213: _low_level_execute_command(): starting 30575 1726867610.40220: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867610.3409207-32810-103691477672899/ /root/.ansible/tmp/ansible-tmp-1726867610.3409207-32810-103691477672899/AnsiballZ_command.py && sleep 0' 30575 1726867610.41024: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867610.41048: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867610.41136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867610.43188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867610.43191: stdout chunk (state=3): >>><<< 30575 1726867610.43193: stderr chunk (state=3): >>><<< 30575 1726867610.43195: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867610.43197: _low_level_execute_command(): starting 30575 1726867610.43199: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867610.3409207-32810-103691477672899/AnsiballZ_command.py && sleep 0' 30575 1726867610.44188: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867610.44207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867610.44222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867610.44255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867610.44353: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' 
debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867610.44611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867610.44805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867610.61572: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 17:26:50.596700", "end": "2024-09-20 17:26:50.613405", "delta": "0:00:00.016705", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867610.63070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867610.63298: stderr chunk (state=3): >>><<< 30575 1726867610.63302: stdout chunk (state=3): >>><<< 30575 1726867610.63306: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 17:26:50.596700", "end": "2024-09-20 17:26:50.613405", "delta": "0:00:00.016705", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.15.68 closed. 30575 1726867610.63309: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867610.3409207-32810-103691477672899/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867610.63315: _low_level_execute_command(): starting 30575 1726867610.63319: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867610.3409207-32810-103691477672899/ > /dev/null 2>&1 && sleep 0' 30575 1726867610.64365: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867610.64487: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867610.64507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867610.64553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867610.64562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867610.64589: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867610.64605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867610.64653: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867610.64748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867610.64828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867610.64927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867610.66740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867610.66755: stdout chunk (state=3): >>><<< 30575 1726867610.66765: stderr chunk (state=3): >>><<< 30575 1726867610.66787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867610.66800: handler run complete 30575 1726867610.66827: Evaluated conditional (False): False 30575 1726867610.66842: attempt loop complete, returning result 30575 1726867610.66849: _execute() done 30575 1726867610.66863: dumping result to json 30575 1726867610.66872: done dumping result, returning 30575 1726867610.66963: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcac9-a3a5-e081-a588-000000000f16] 30575 1726867610.66966: sending task result for task 0affcac9-a3a5-e081-a588-000000000f16 30575 1726867610.67038: done sending task result for task 0affcac9-a3a5-e081-a588-000000000f16 30575 1726867610.67042: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.016705", "end": "2024-09-20 17:26:50.613405", "rc": 0, "start": "2024-09-20 17:26:50.596700" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 30575 1726867610.67118: no more pending results, returning what we have 30575 1726867610.67122: results queue empty 30575 1726867610.67123: checking for any_errors_fatal 30575 1726867610.67130: done checking for any_errors_fatal 30575 1726867610.67131: checking for max_fail_percentage 30575 1726867610.67133: done checking for max_fail_percentage 30575 1726867610.67134: checking to see if all hosts have failed and the running result is not ok 30575 1726867610.67135: done checking to see if all hosts have failed 30575 1726867610.67136: getting the remaining hosts for this loop 30575 1726867610.67138: done getting the remaining hosts for this loop 30575 
1726867610.67142: getting the next task for host managed_node3 30575 1726867610.67150: done getting next task for host managed_node3 30575 1726867610.67152: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30575 1726867610.67158: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867610.67162: getting variables 30575 1726867610.67164: in VariableManager get_vars() 30575 1726867610.67310: Calling all_inventory to load vars for managed_node3 30575 1726867610.67313: Calling groups_inventory to load vars for managed_node3 30575 1726867610.67317: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867610.67330: Calling all_plugins_play to load vars for managed_node3 30575 1726867610.67333: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867610.67336: Calling groups_plugins_play to load vars for managed_node3 30575 1726867610.69563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867610.72558: done with get_vars() 30575 1726867610.72581: done getting variables 30575 1726867610.72647: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:26:50 -0400 (0:00:00.429) 0:00:46.104 ****** 30575 1726867610.72682: entering _queue_task() for managed_node3/set_fact 30575 1726867610.73124: worker is 1 (out of 1 available) 30575 1726867610.73137: exiting _queue_task() for managed_node3/set_fact 30575 1726867610.73153: done queuing things up, now waiting for results queue to drain 30575 1726867610.73155: waiting for pending results... 
30575 1726867610.73609: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30575 1726867610.73615: in run() - task 0affcac9-a3a5-e081-a588-000000000f17 30575 1726867610.73622: variable 'ansible_search_path' from source: unknown 30575 1726867610.73625: variable 'ansible_search_path' from source: unknown 30575 1726867610.73628: calling self._execute() 30575 1726867610.73719: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867610.73723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867610.73806: variable 'omit' from source: magic vars 30575 1726867610.74081: variable 'ansible_distribution_major_version' from source: facts 30575 1726867610.74093: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867610.74222: variable 'nm_profile_exists' from source: set_fact 30575 1726867610.74230: Evaluated conditional (nm_profile_exists.rc == 0): True 30575 1726867610.74241: variable 'omit' from source: magic vars 30575 1726867610.74289: variable 'omit' from source: magic vars 30575 1726867610.74322: variable 'omit' from source: magic vars 30575 1726867610.74358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867610.74393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867610.74411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867610.74443: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867610.74449: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867610.74545: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
30575 1726867610.74548: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867610.74551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867610.74596: Set connection var ansible_pipelining to False 30575 1726867610.74599: Set connection var ansible_shell_type to sh 30575 1726867610.74636: Set connection var ansible_shell_executable to /bin/sh 30575 1726867610.74639: Set connection var ansible_timeout to 10 30575 1726867610.74644: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867610.74648: Set connection var ansible_connection to ssh 30575 1726867610.74730: variable 'ansible_shell_executable' from source: unknown 30575 1726867610.74733: variable 'ansible_connection' from source: unknown 30575 1726867610.74736: variable 'ansible_module_compression' from source: unknown 30575 1726867610.74738: variable 'ansible_shell_type' from source: unknown 30575 1726867610.74740: variable 'ansible_shell_executable' from source: unknown 30575 1726867610.74741: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867610.74743: variable 'ansible_pipelining' from source: unknown 30575 1726867610.74745: variable 'ansible_timeout' from source: unknown 30575 1726867610.74748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867610.74847: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867610.74851: variable 'omit' from source: magic vars 30575 1726867610.74854: starting attempt loop 30575 1726867610.74856: running the handler 30575 1726867610.74869: handler run complete 30575 1726867610.74919: attempt loop complete, returning result 30575 1726867610.74994: _execute() done 
30575 1726867610.75014: dumping result to json 30575 1726867610.75019: done dumping result, returning 30575 1726867610.75021: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-e081-a588-000000000f17] 30575 1726867610.75023: sending task result for task 0affcac9-a3a5-e081-a588-000000000f17 30575 1726867610.75081: done sending task result for task 0affcac9-a3a5-e081-a588-000000000f17 30575 1726867610.75084: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 30575 1726867610.75198: no more pending results, returning what we have 30575 1726867610.75201: results queue empty 30575 1726867610.75202: checking for any_errors_fatal 30575 1726867610.75208: done checking for any_errors_fatal 30575 1726867610.75209: checking for max_fail_percentage 30575 1726867610.75210: done checking for max_fail_percentage 30575 1726867610.75211: checking to see if all hosts have failed and the running result is not ok 30575 1726867610.75328: done checking to see if all hosts have failed 30575 1726867610.75329: getting the remaining hosts for this loop 30575 1726867610.75330: done getting the remaining hosts for this loop 30575 1726867610.75335: getting the next task for host managed_node3 30575 1726867610.75347: done getting next task for host managed_node3 30575 1726867610.75349: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30575 1726867610.75354: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867610.75357: getting variables 30575 1726867610.75359: in VariableManager get_vars() 30575 1726867610.75388: Calling all_inventory to load vars for managed_node3 30575 1726867610.75392: Calling groups_inventory to load vars for managed_node3 30575 1726867610.75395: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867610.75404: Calling all_plugins_play to load vars for managed_node3 30575 1726867610.75407: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867610.75410: Calling groups_plugins_play to load vars for managed_node3 30575 1726867610.79819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867610.82235: done with get_vars() 30575 1726867610.82257: done getting variables 30575 1726867610.82335: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867610.82527: variable 'profile' from source: play vars 30575 
1726867610.82531: variable 'interface' from source: play vars 30575 1726867610.82598: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:26:50 -0400 (0:00:00.099) 0:00:46.203 ****** 30575 1726867610.82631: entering _queue_task() for managed_node3/command 30575 1726867610.83102: worker is 1 (out of 1 available) 30575 1726867610.83112: exiting _queue_task() for managed_node3/command 30575 1726867610.83123: done queuing things up, now waiting for results queue to drain 30575 1726867610.83125: waiting for pending results... 30575 1726867610.83504: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr 30575 1726867610.83508: in run() - task 0affcac9-a3a5-e081-a588-000000000f19 30575 1726867610.83511: variable 'ansible_search_path' from source: unknown 30575 1726867610.83522: variable 'ansible_search_path' from source: unknown 30575 1726867610.83528: calling self._execute() 30575 1726867610.83629: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867610.83643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867610.83658: variable 'omit' from source: magic vars 30575 1726867610.84043: variable 'ansible_distribution_major_version' from source: facts 30575 1726867610.84054: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867610.84174: variable 'profile_stat' from source: set_fact 30575 1726867610.84185: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867610.84188: when evaluation is False, skipping this task 30575 1726867610.84191: _execute() done 30575 1726867610.84195: dumping result to json 30575 1726867610.84198: done dumping result, returning 30575 1726867610.84206: done running 
TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-000000000f19] 30575 1726867610.84211: sending task result for task 0affcac9-a3a5-e081-a588-000000000f19 30575 1726867610.84311: done sending task result for task 0affcac9-a3a5-e081-a588-000000000f19 30575 1726867610.84314: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867610.84363: no more pending results, returning what we have 30575 1726867610.84366: results queue empty 30575 1726867610.84367: checking for any_errors_fatal 30575 1726867610.84374: done checking for any_errors_fatal 30575 1726867610.84375: checking for max_fail_percentage 30575 1726867610.84376: done checking for max_fail_percentage 30575 1726867610.84379: checking to see if all hosts have failed and the running result is not ok 30575 1726867610.84380: done checking to see if all hosts have failed 30575 1726867610.84381: getting the remaining hosts for this loop 30575 1726867610.84382: done getting the remaining hosts for this loop 30575 1726867610.84386: getting the next task for host managed_node3 30575 1726867610.84510: done getting next task for host managed_node3 30575 1726867610.84513: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30575 1726867610.84517: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867610.84521: getting variables 30575 1726867610.84522: in VariableManager get_vars() 30575 1726867610.84548: Calling all_inventory to load vars for managed_node3 30575 1726867610.84550: Calling groups_inventory to load vars for managed_node3 30575 1726867610.84557: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867610.84567: Calling all_plugins_play to load vars for managed_node3 30575 1726867610.84570: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867610.84572: Calling groups_plugins_play to load vars for managed_node3 30575 1726867610.86582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867610.89397: done with get_vars() 30575 1726867610.89422: done getting variables 30575 1726867610.89475: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867610.89711: variable 'profile' from source: play vars 30575 1726867610.89715: variable 'interface' from source: play vars 30575 1726867610.89820: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] 
********************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:26:50 -0400 (0:00:00.072) 0:00:46.276 ****** 30575 1726867610.89900: entering _queue_task() for managed_node3/set_fact 30575 1726867610.90391: worker is 1 (out of 1 available) 30575 1726867610.90404: exiting _queue_task() for managed_node3/set_fact 30575 1726867610.90421: done queuing things up, now waiting for results queue to drain 30575 1726867610.90423: waiting for pending results... 30575 1726867610.90733: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr 30575 1726867610.90819: in run() - task 0affcac9-a3a5-e081-a588-000000000f1a 30575 1726867610.90845: variable 'ansible_search_path' from source: unknown 30575 1726867610.90854: variable 'ansible_search_path' from source: unknown 30575 1726867610.90938: calling self._execute() 30575 1726867610.91004: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867610.91015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867610.91028: variable 'omit' from source: magic vars 30575 1726867610.91402: variable 'ansible_distribution_major_version' from source: facts 30575 1726867610.91420: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867610.91563: variable 'profile_stat' from source: set_fact 30575 1726867610.91571: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867610.91574: when evaluation is False, skipping this task 30575 1726867610.91582: _execute() done 30575 1726867610.91585: dumping result to json 30575 1726867610.91588: done dumping result, returning 30575 1726867610.91596: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-000000000f1a] 30575 1726867610.91601: sending task result for task 
0affcac9-a3a5-e081-a588-000000000f1a 30575 1726867610.91701: done sending task result for task 0affcac9-a3a5-e081-a588-000000000f1a 30575 1726867610.91710: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867610.91759: no more pending results, returning what we have 30575 1726867610.91763: results queue empty 30575 1726867610.91764: checking for any_errors_fatal 30575 1726867610.91771: done checking for any_errors_fatal 30575 1726867610.91771: checking for max_fail_percentage 30575 1726867610.91773: done checking for max_fail_percentage 30575 1726867610.91774: checking to see if all hosts have failed and the running result is not ok 30575 1726867610.91775: done checking to see if all hosts have failed 30575 1726867610.91776: getting the remaining hosts for this loop 30575 1726867610.91779: done getting the remaining hosts for this loop 30575 1726867610.91784: getting the next task for host managed_node3 30575 1726867610.91791: done getting next task for host managed_node3 30575 1726867610.91793: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30575 1726867610.91798: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867610.91803: getting variables 30575 1726867610.91806: in VariableManager get_vars() 30575 1726867610.91839: Calling all_inventory to load vars for managed_node3 30575 1726867610.91841: Calling groups_inventory to load vars for managed_node3 30575 1726867610.91844: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867610.91854: Calling all_plugins_play to load vars for managed_node3 30575 1726867610.91856: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867610.91858: Calling groups_plugins_play to load vars for managed_node3 30575 1726867610.93062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867610.94381: done with get_vars() 30575 1726867610.94396: done getting variables 30575 1726867610.94437: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867610.94515: variable 'profile' from source: play vars 30575 1726867610.94518: variable 'interface' from source: play vars 30575 1726867610.94557: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:26:50 -0400 (0:00:00.046) 
0:00:46.323 ****** 30575 1726867610.94582: entering _queue_task() for managed_node3/command 30575 1726867610.94808: worker is 1 (out of 1 available) 30575 1726867610.94822: exiting _queue_task() for managed_node3/command 30575 1726867610.94835: done queuing things up, now waiting for results queue to drain 30575 1726867610.94837: waiting for pending results... 30575 1726867610.95020: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr 30575 1726867610.95097: in run() - task 0affcac9-a3a5-e081-a588-000000000f1b 30575 1726867610.95109: variable 'ansible_search_path' from source: unknown 30575 1726867610.95113: variable 'ansible_search_path' from source: unknown 30575 1726867610.95142: calling self._execute() 30575 1726867610.95212: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867610.95215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867610.95229: variable 'omit' from source: magic vars 30575 1726867610.95490: variable 'ansible_distribution_major_version' from source: facts 30575 1726867610.95503: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867610.95584: variable 'profile_stat' from source: set_fact 30575 1726867610.95593: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867610.95596: when evaluation is False, skipping this task 30575 1726867610.95598: _execute() done 30575 1726867610.95604: dumping result to json 30575 1726867610.95607: done dumping result, returning 30575 1726867610.95618: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-000000000f1b] 30575 1726867610.95621: sending task result for task 0affcac9-a3a5-e081-a588-000000000f1b 30575 1726867610.95701: done sending task result for task 0affcac9-a3a5-e081-a588-000000000f1b 30575 1726867610.95704: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, 
"false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867610.95768: no more pending results, returning what we have 30575 1726867610.95772: results queue empty 30575 1726867610.95773: checking for any_errors_fatal 30575 1726867610.95784: done checking for any_errors_fatal 30575 1726867610.95785: checking for max_fail_percentage 30575 1726867610.95786: done checking for max_fail_percentage 30575 1726867610.95787: checking to see if all hosts have failed and the running result is not ok 30575 1726867610.95788: done checking to see if all hosts have failed 30575 1726867610.95789: getting the remaining hosts for this loop 30575 1726867610.95790: done getting the remaining hosts for this loop 30575 1726867610.95794: getting the next task for host managed_node3 30575 1726867610.95801: done getting next task for host managed_node3 30575 1726867610.95803: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30575 1726867610.95807: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867610.95810: getting variables 30575 1726867610.95812: in VariableManager get_vars() 30575 1726867610.95842: Calling all_inventory to load vars for managed_node3 30575 1726867610.95844: Calling groups_inventory to load vars for managed_node3 30575 1726867610.95848: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867610.95857: Calling all_plugins_play to load vars for managed_node3 30575 1726867610.95860: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867610.95862: Calling groups_plugins_play to load vars for managed_node3 30575 1726867610.97111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867610.98206: done with get_vars() 30575 1726867610.98224: done getting variables 30575 1726867610.98267: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867610.98346: variable 'profile' from source: play vars 30575 1726867610.98350: variable 'interface' from source: play vars 30575 1726867610.98391: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:26:50 -0400 (0:00:00.038) 0:00:46.361 ****** 30575 1726867610.98415: entering _queue_task() for managed_node3/set_fact 30575 1726867610.98657: worker is 1 (out of 1 available) 30575 1726867610.98669: exiting _queue_task() for managed_node3/set_fact 30575 
1726867610.98683: done queuing things up, now waiting for results queue to drain 30575 1726867610.98685: waiting for pending results... 30575 1726867610.98869: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr 30575 1726867610.98956: in run() - task 0affcac9-a3a5-e081-a588-000000000f1c 30575 1726867610.98966: variable 'ansible_search_path' from source: unknown 30575 1726867610.98969: variable 'ansible_search_path' from source: unknown 30575 1726867610.99007: calling self._execute() 30575 1726867610.99078: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867610.99086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867610.99095: variable 'omit' from source: magic vars 30575 1726867610.99438: variable 'ansible_distribution_major_version' from source: facts 30575 1726867610.99442: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867610.99689: variable 'profile_stat' from source: set_fact 30575 1726867610.99692: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867610.99693: when evaluation is False, skipping this task 30575 1726867610.99696: _execute() done 30575 1726867610.99697: dumping result to json 30575 1726867610.99699: done dumping result, returning 30575 1726867610.99701: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-000000000f1c] 30575 1726867610.99704: sending task result for task 0affcac9-a3a5-e081-a588-000000000f1c 30575 1726867610.99763: done sending task result for task 0affcac9-a3a5-e081-a588-000000000f1c 30575 1726867610.99766: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867610.99827: no more pending results, returning what we have 30575 1726867610.99831: results queue empty 30575 
1726867610.99831: checking for any_errors_fatal 30575 1726867610.99837: done checking for any_errors_fatal 30575 1726867610.99838: checking for max_fail_percentage 30575 1726867610.99839: done checking for max_fail_percentage 30575 1726867610.99840: checking to see if all hosts have failed and the running result is not ok 30575 1726867610.99841: done checking to see if all hosts have failed 30575 1726867610.99842: getting the remaining hosts for this loop 30575 1726867610.99843: done getting the remaining hosts for this loop 30575 1726867610.99846: getting the next task for host managed_node3 30575 1726867610.99853: done getting next task for host managed_node3 30575 1726867610.99856: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 30575 1726867610.99859: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867610.99862: getting variables 30575 1726867610.99863: in VariableManager get_vars() 30575 1726867610.99891: Calling all_inventory to load vars for managed_node3 30575 1726867610.99894: Calling groups_inventory to load vars for managed_node3 30575 1726867610.99896: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867610.99905: Calling all_plugins_play to load vars for managed_node3 30575 1726867610.99907: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867610.99910: Calling groups_plugins_play to load vars for managed_node3 30575 1726867611.01117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867611.02078: done with get_vars() 30575 1726867611.02096: done getting variables 30575 1726867611.02138: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867611.02216: variable 'profile' from source: play vars 30575 1726867611.02220: variable 'interface' from source: play vars 30575 1726867611.02258: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 17:26:51 -0400 (0:00:00.038) 0:00:46.400 ****** 30575 1726867611.02283: entering _queue_task() for managed_node3/assert 30575 1726867611.02502: worker is 1 (out of 1 available) 30575 1726867611.02514: exiting _queue_task() for managed_node3/assert 30575 1726867611.02527: done queuing things up, now waiting for results queue to drain 30575 1726867611.02529: waiting for pending results... 
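The assert task queued here corresponds to assert_profile_present.yml:5. Only the task name and the asserted fact are visible in the log (it later reports "Evaluated conditional (lsr_net_profile_exists): True"), so the following is a hedged reconstruction, not the actual file contents:

```yaml
# Hypothetical reconstruction from the log; the actual task file may differ.
- name: "Assert that the profile is present - '{{ profile }}'"
  ansible.builtin.assert:
    that:
      - lsr_net_profile_exists   # the log shows this conditional evaluating True
```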
30575 1726867611.02711: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'statebr' 30575 1726867611.02788: in run() - task 0affcac9-a3a5-e081-a588-000000000e8c 30575 1726867611.02799: variable 'ansible_search_path' from source: unknown 30575 1726867611.02804: variable 'ansible_search_path' from source: unknown 30575 1726867611.02834: calling self._execute() 30575 1726867611.02903: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.02907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.02915: variable 'omit' from source: magic vars 30575 1726867611.03232: variable 'ansible_distribution_major_version' from source: facts 30575 1726867611.03235: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867611.03238: variable 'omit' from source: magic vars 30575 1726867611.03342: variable 'omit' from source: magic vars 30575 1726867611.03584: variable 'profile' from source: play vars 30575 1726867611.03588: variable 'interface' from source: play vars 30575 1726867611.03590: variable 'interface' from source: play vars 30575 1726867611.03592: variable 'omit' from source: magic vars 30575 1726867611.03594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867611.03597: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867611.03599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867611.03602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867611.03604: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867611.03607: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30575 1726867611.03608: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.03610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.03700: Set connection var ansible_pipelining to False 30575 1726867611.03703: Set connection var ansible_shell_type to sh 30575 1726867611.03709: Set connection var ansible_shell_executable to /bin/sh 30575 1726867611.03714: Set connection var ansible_timeout to 10 30575 1726867611.03727: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867611.03730: Set connection var ansible_connection to ssh 30575 1726867611.03749: variable 'ansible_shell_executable' from source: unknown 30575 1726867611.03751: variable 'ansible_connection' from source: unknown 30575 1726867611.03754: variable 'ansible_module_compression' from source: unknown 30575 1726867611.03756: variable 'ansible_shell_type' from source: unknown 30575 1726867611.03758: variable 'ansible_shell_executable' from source: unknown 30575 1726867611.03760: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.03765: variable 'ansible_pipelining' from source: unknown 30575 1726867611.03767: variable 'ansible_timeout' from source: unknown 30575 1726867611.03770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.03946: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867611.03954: variable 'omit' from source: magic vars 30575 1726867611.03956: starting attempt loop 30575 1726867611.03959: running the handler 30575 1726867611.04021: variable 'lsr_net_profile_exists' from source: set_fact 30575 1726867611.04024: Evaluated conditional 
(lsr_net_profile_exists): True 30575 1726867611.04027: handler run complete 30575 1726867611.04059: attempt loop complete, returning result 30575 1726867611.04063: _execute() done 30575 1726867611.04065: dumping result to json 30575 1726867611.04067: done dumping result, returning 30575 1726867611.04069: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'statebr' [0affcac9-a3a5-e081-a588-000000000e8c] 30575 1726867611.04071: sending task result for task 0affcac9-a3a5-e081-a588-000000000e8c 30575 1726867611.04151: done sending task result for task 0affcac9-a3a5-e081-a588-000000000e8c 30575 1726867611.04154: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867611.04210: no more pending results, returning what we have 30575 1726867611.04213: results queue empty 30575 1726867611.04213: checking for any_errors_fatal 30575 1726867611.04221: done checking for any_errors_fatal 30575 1726867611.04221: checking for max_fail_percentage 30575 1726867611.04223: done checking for max_fail_percentage 30575 1726867611.04224: checking to see if all hosts have failed and the running result is not ok 30575 1726867611.04225: done checking to see if all hosts have failed 30575 1726867611.04225: getting the remaining hosts for this loop 30575 1726867611.04227: done getting the remaining hosts for this loop 30575 1726867611.04230: getting the next task for host managed_node3 30575 1726867611.04237: done getting next task for host managed_node3 30575 1726867611.04239: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 30575 1726867611.04242: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867611.04246: getting variables 30575 1726867611.04247: in VariableManager get_vars() 30575 1726867611.04356: Calling all_inventory to load vars for managed_node3 30575 1726867611.04359: Calling groups_inventory to load vars for managed_node3 30575 1726867611.04362: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867611.04371: Calling all_plugins_play to load vars for managed_node3 30575 1726867611.04374: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867611.04379: Calling groups_plugins_play to load vars for managed_node3 30575 1726867611.09535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867611.10852: done with get_vars() 30575 1726867611.10875: done getting variables 30575 1726867611.10921: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867611.11001: variable 'profile' from source: play vars 30575 1726867611.11003: variable 'interface' from source: play vars 30575 1726867611.11047: variable 'interface' from 
source: play vars TASK [Assert that the ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 17:26:51 -0400 (0:00:00.087) 0:00:46.488 ****** 30575 1726867611.11082: entering _queue_task() for managed_node3/assert 30575 1726867611.11405: worker is 1 (out of 1 available) 30575 1726867611.11421: exiting _queue_task() for managed_node3/assert 30575 1726867611.11432: done queuing things up, now waiting for results queue to drain 30575 1726867611.11436: waiting for pending results... 30575 1726867611.11659: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'statebr' 30575 1726867611.11786: in run() - task 0affcac9-a3a5-e081-a588-000000000e8d 30575 1726867611.11792: variable 'ansible_search_path' from source: unknown 30575 1726867611.11796: variable 'ansible_search_path' from source: unknown 30575 1726867611.11829: calling self._execute() 30575 1726867611.11898: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.11902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.11911: variable 'omit' from source: magic vars 30575 1726867611.12238: variable 'ansible_distribution_major_version' from source: facts 30575 1726867611.12247: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867611.12254: variable 'omit' from source: magic vars 30575 1726867611.12292: variable 'omit' from source: magic vars 30575 1726867611.12358: variable 'profile' from source: play vars 30575 1726867611.12361: variable 'interface' from source: play vars 30575 1726867611.12413: variable 'interface' from source: play vars 30575 1726867611.12430: variable 'omit' from source: magic vars 30575 1726867611.12463: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 
30575 1726867611.12493: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867611.12508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867611.12524: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867611.12534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867611.12558: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867611.12561: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.12564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.12636: Set connection var ansible_pipelining to False 30575 1726867611.12640: Set connection var ansible_shell_type to sh 30575 1726867611.12643: Set connection var ansible_shell_executable to /bin/sh 30575 1726867611.12649: Set connection var ansible_timeout to 10 30575 1726867611.12654: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867611.12660: Set connection var ansible_connection to ssh 30575 1726867611.12679: variable 'ansible_shell_executable' from source: unknown 30575 1726867611.12682: variable 'ansible_connection' from source: unknown 30575 1726867611.12685: variable 'ansible_module_compression' from source: unknown 30575 1726867611.12688: variable 'ansible_shell_type' from source: unknown 30575 1726867611.12691: variable 'ansible_shell_executable' from source: unknown 30575 1726867611.12693: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.12697: variable 'ansible_pipelining' from source: unknown 30575 1726867611.12700: variable 'ansible_timeout' from source: unknown 30575 1726867611.12703: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.12803: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867611.12813: variable 'omit' from source: magic vars 30575 1726867611.12822: starting attempt loop 30575 1726867611.12825: running the handler 30575 1726867611.12895: variable 'lsr_net_profile_ansible_managed' from source: set_fact 30575 1726867611.12899: Evaluated conditional (lsr_net_profile_ansible_managed): True 30575 1726867611.12904: handler run complete 30575 1726867611.12915: attempt loop complete, returning result 30575 1726867611.12918: _execute() done 30575 1726867611.12924: dumping result to json 30575 1726867611.12927: done dumping result, returning 30575 1726867611.12937: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'statebr' [0affcac9-a3a5-e081-a588-000000000e8d] 30575 1726867611.12939: sending task result for task 0affcac9-a3a5-e081-a588-000000000e8d 30575 1726867611.13021: done sending task result for task 0affcac9-a3a5-e081-a588-000000000e8d 30575 1726867611.13023: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867611.13083: no more pending results, returning what we have 30575 1726867611.13087: results queue empty 30575 1726867611.13087: checking for any_errors_fatal 30575 1726867611.13096: done checking for any_errors_fatal 30575 1726867611.13097: checking for max_fail_percentage 30575 1726867611.13099: done checking for max_fail_percentage 30575 1726867611.13100: checking to see if all hosts have failed and the running result is not ok 30575 1726867611.13101: done checking to see if all hosts have failed 30575 1726867611.13101: 
getting the remaining hosts for this loop 30575 1726867611.13103: done getting the remaining hosts for this loop 30575 1726867611.13106: getting the next task for host managed_node3 30575 1726867611.13114: done getting next task for host managed_node3 30575 1726867611.13116: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 30575 1726867611.13119: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867611.13125: getting variables 30575 1726867611.13127: in VariableManager get_vars() 30575 1726867611.13161: Calling all_inventory to load vars for managed_node3 30575 1726867611.13163: Calling groups_inventory to load vars for managed_node3 30575 1726867611.13167: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867611.13176: Calling all_plugins_play to load vars for managed_node3 30575 1726867611.13181: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867611.13183: Calling groups_plugins_play to load vars for managed_node3 30575 1726867611.13961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867611.15260: done with get_vars() 30575 1726867611.15284: done getting variables 30575 1726867611.15359: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867611.15462: variable 'profile' from source: play vars 30575 1726867611.15466: variable 'interface' from source: play vars 30575 1726867611.15542: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 17:26:51 -0400 (0:00:00.044) 0:00:46.533 ****** 30575 1726867611.15584: entering _queue_task() for managed_node3/assert 30575 1726867611.15907: worker is 1 (out of 1 available) 30575 1726867611.15922: exiting _queue_task() for managed_node3/assert 30575 1726867611.15936: done queuing things up, now waiting for results queue to drain 30575 1726867611.15938: waiting for pending results... 
30575 1726867611.16182: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in statebr 30575 1726867611.16303: in run() - task 0affcac9-a3a5-e081-a588-000000000e8e 30575 1726867611.16327: variable 'ansible_search_path' from source: unknown 30575 1726867611.16333: variable 'ansible_search_path' from source: unknown 30575 1726867611.16396: calling self._execute() 30575 1726867611.16481: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.16486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.16495: variable 'omit' from source: magic vars 30575 1726867611.16896: variable 'ansible_distribution_major_version' from source: facts 30575 1726867611.16907: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867611.16913: variable 'omit' from source: magic vars 30575 1726867611.16960: variable 'omit' from source: magic vars 30575 1726867611.17105: variable 'profile' from source: play vars 30575 1726867611.17109: variable 'interface' from source: play vars 30575 1726867611.17164: variable 'interface' from source: play vars 30575 1726867611.17194: variable 'omit' from source: magic vars 30575 1726867611.17226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867611.17275: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867611.17304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867611.17330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867611.17333: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867611.17368: variable 'inventory_hostname' from source: host 
vars for 'managed_node3' 30575 1726867611.17372: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.17382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.17456: Set connection var ansible_pipelining to False 30575 1726867611.17459: Set connection var ansible_shell_type to sh 30575 1726867611.17472: Set connection var ansible_shell_executable to /bin/sh 30575 1726867611.17475: Set connection var ansible_timeout to 10 30575 1726867611.17479: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867611.17482: Set connection var ansible_connection to ssh 30575 1726867611.17506: variable 'ansible_shell_executable' from source: unknown 30575 1726867611.17509: variable 'ansible_connection' from source: unknown 30575 1726867611.17511: variable 'ansible_module_compression' from source: unknown 30575 1726867611.17514: variable 'ansible_shell_type' from source: unknown 30575 1726867611.17516: variable 'ansible_shell_executable' from source: unknown 30575 1726867611.17519: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.17527: variable 'ansible_pipelining' from source: unknown 30575 1726867611.17529: variable 'ansible_timeout' from source: unknown 30575 1726867611.17532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.17646: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867611.17655: variable 'omit' from source: magic vars 30575 1726867611.17660: starting attempt loop 30575 1726867611.17663: running the handler 30575 1726867611.17746: variable 'lsr_net_profile_fingerprint' from source: set_fact 30575 1726867611.17749: Evaluated 
conditional (lsr_net_profile_fingerprint): True 30575 1726867611.17752: handler run complete 30575 1726867611.17763: attempt loop complete, returning result 30575 1726867611.17766: _execute() done 30575 1726867611.17769: dumping result to json 30575 1726867611.17771: done dumping result, returning 30575 1726867611.17779: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in statebr [0affcac9-a3a5-e081-a588-000000000e8e] 30575 1726867611.17784: sending task result for task 0affcac9-a3a5-e081-a588-000000000e8e 30575 1726867611.17870: done sending task result for task 0affcac9-a3a5-e081-a588-000000000e8e 30575 1726867611.17872: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867611.17925: no more pending results, returning what we have 30575 1726867611.17928: results queue empty 30575 1726867611.17929: checking for any_errors_fatal 30575 1726867611.17939: done checking for any_errors_fatal 30575 1726867611.17940: checking for max_fail_percentage 30575 1726867611.17941: done checking for max_fail_percentage 30575 1726867611.17943: checking to see if all hosts have failed and the running result is not ok 30575 1726867611.17944: done checking to see if all hosts have failed 30575 1726867611.17944: getting the remaining hosts for this loop 30575 1726867611.17946: done getting the remaining hosts for this loop 30575 1726867611.17950: getting the next task for host managed_node3 30575 1726867611.17959: done getting next task for host managed_node3 30575 1726867611.17962: ^ task is: TASK: Conditional asserts 30575 1726867611.17965: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867611.17969: getting variables 30575 1726867611.17971: in VariableManager get_vars() 30575 1726867611.18014: Calling all_inventory to load vars for managed_node3 30575 1726867611.18017: Calling groups_inventory to load vars for managed_node3 30575 1726867611.18021: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867611.18031: Calling all_plugins_play to load vars for managed_node3 30575 1726867611.18033: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867611.18036: Calling groups_plugins_play to load vars for managed_node3 30575 1726867611.19024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867611.19959: done with get_vars() 30575 1726867611.19978: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 17:26:51 -0400 (0:00:00.044) 0:00:46.578 ****** 30575 1726867611.20045: entering _queue_task() for managed_node3/include_tasks 30575 1726867611.20310: worker is 1 (out of 1 available) 30575 1726867611.20326: exiting _queue_task() for managed_node3/include_tasks 30575 1726867611.20339: done queuing things up, now waiting for results queue to drain 30575 1726867611.20340: waiting for pending results... 
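Editor's note: the trace above evaluates a sequence of assert tasks from assert_profile_present.yml (the logged task paths point at lines 10 and 15 of that file). The file's contents are not part of this log, so the following is only a sketch of the pattern the trace implies: the conditional expressions (lsr_net_profile_exists, lsr_net_profile_ansible_managed, lsr_net_profile_fingerprint) and their set_fact origin are taken directly from the "Evaluated conditional" and "from source" lines above, while everything else about the layout is an assumption.

```yaml
# Sketch only: the actual assert_profile_present.yml is not shown in this log.
# Conditions come from the "Evaluated conditional" lines above.
- name: "Assert that the profile is present - '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_exists            # from set_fact, per the log

- name: "Assert that the ansible managed comment is present in '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_ansible_managed   # from set_fact, per the log

- name: "Assert that the fingerprint comment is present in {{ profile }}"
  assert:
    that:
      - lsr_net_profile_fingerprint       # from set_fact, per the log
```

Each of these produced `ok: [managed_node3]` with "All assertions passed" in the trace, so all three facts were truthy on this run.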
30575 1726867611.20564: running TaskExecutor() for managed_node3/TASK: Conditional asserts 30575 1726867611.20682: in run() - task 0affcac9-a3a5-e081-a588-000000000a4f 30575 1726867611.20716: variable 'ansible_search_path' from source: unknown 30575 1726867611.20719: variable 'ansible_search_path' from source: unknown 30575 1726867611.20927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867611.22859: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867611.22906: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867611.22950: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867611.22975: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867611.22997: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867611.23093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867611.23118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867611.23139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867611.23183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30575 1726867611.23200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867611.23309: dumping result to json 30575 1726867611.23313: done dumping result, returning 30575 1726867611.23318: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [0affcac9-a3a5-e081-a588-000000000a4f] 30575 1726867611.23326: sending task result for task 0affcac9-a3a5-e081-a588-000000000a4f 30575 1726867611.23425: done sending task result for task 0affcac9-a3a5-e081-a588-000000000a4f 30575 1726867611.23429: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "skipped_reason": "No items in the list" } 30575 1726867611.23473: no more pending results, returning what we have 30575 1726867611.23483: results queue empty 30575 1726867611.23484: checking for any_errors_fatal 30575 1726867611.23491: done checking for any_errors_fatal 30575 1726867611.23492: checking for max_fail_percentage 30575 1726867611.23493: done checking for max_fail_percentage 30575 1726867611.23494: checking to see if all hosts have failed and the running result is not ok 30575 1726867611.23495: done checking to see if all hosts have failed 30575 1726867611.23496: getting the remaining hosts for this loop 30575 1726867611.23497: done getting the remaining hosts for this loop 30575 1726867611.23502: getting the next task for host managed_node3 30575 1726867611.23509: done getting next task for host managed_node3 30575 1726867611.23512: ^ task is: TASK: Success in test '{{ lsr_description }}' 30575 1726867611.23515: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867611.23518: getting variables 30575 1726867611.23520: in VariableManager get_vars() 30575 1726867611.23553: Calling all_inventory to load vars for managed_node3 30575 1726867611.23556: Calling groups_inventory to load vars for managed_node3 30575 1726867611.23559: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867611.23568: Calling all_plugins_play to load vars for managed_node3 30575 1726867611.23571: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867611.23573: Calling groups_plugins_play to load vars for managed_node3 30575 1726867611.24512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867611.25352: done with get_vars() 30575 1726867611.25367: done getting variables 30575 1726867611.25410: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867611.25496: variable 'lsr_description' from source: include params TASK [Success in test 'I can activate an existing profile'] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 17:26:51 -0400 (0:00:00.054) 0:00:46.632 ****** 30575 1726867611.25519: entering _queue_task() for managed_node3/debug 30575 1726867611.25756: worker is 1 (out of 
1 available) 30575 1726867611.25771: exiting _queue_task() for managed_node3/debug 30575 1726867611.25785: done queuing things up, now waiting for results queue to drain 30575 1726867611.25787: waiting for pending results... 30575 1726867611.25966: running TaskExecutor() for managed_node3/TASK: Success in test 'I can activate an existing profile' 30575 1726867611.26047: in run() - task 0affcac9-a3a5-e081-a588-000000000a50 30575 1726867611.26059: variable 'ansible_search_path' from source: unknown 30575 1726867611.26064: variable 'ansible_search_path' from source: unknown 30575 1726867611.26093: calling self._execute() 30575 1726867611.26164: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.26168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.26180: variable 'omit' from source: magic vars 30575 1726867611.26447: variable 'ansible_distribution_major_version' from source: facts 30575 1726867611.26459: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867611.26463: variable 'omit' from source: magic vars 30575 1726867611.26494: variable 'omit' from source: magic vars 30575 1726867611.26562: variable 'lsr_description' from source: include params 30575 1726867611.26576: variable 'omit' from source: magic vars 30575 1726867611.26611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867611.26639: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867611.26656: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867611.26668: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867611.26683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30575 1726867611.26705: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867611.26709: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.26711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.26779: Set connection var ansible_pipelining to False 30575 1726867611.26782: Set connection var ansible_shell_type to sh 30575 1726867611.26793: Set connection var ansible_shell_executable to /bin/sh 30575 1726867611.26796: Set connection var ansible_timeout to 10 30575 1726867611.26798: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867611.26805: Set connection var ansible_connection to ssh 30575 1726867611.26826: variable 'ansible_shell_executable' from source: unknown 30575 1726867611.26829: variable 'ansible_connection' from source: unknown 30575 1726867611.26832: variable 'ansible_module_compression' from source: unknown 30575 1726867611.26834: variable 'ansible_shell_type' from source: unknown 30575 1726867611.26837: variable 'ansible_shell_executable' from source: unknown 30575 1726867611.26839: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.26841: variable 'ansible_pipelining' from source: unknown 30575 1726867611.26844: variable 'ansible_timeout' from source: unknown 30575 1726867611.26846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.26949: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867611.26959: variable 'omit' from source: magic vars 30575 1726867611.26963: starting attempt loop 30575 1726867611.26966: running the handler 30575 
1726867611.27004: handler run complete 30575 1726867611.27014: attempt loop complete, returning result 30575 1726867611.27017: _execute() done 30575 1726867611.27021: dumping result to json 30575 1726867611.27026: done dumping result, returning 30575 1726867611.27032: done running TaskExecutor() for managed_node3/TASK: Success in test 'I can activate an existing profile' [0affcac9-a3a5-e081-a588-000000000a50] 30575 1726867611.27037: sending task result for task 0affcac9-a3a5-e081-a588-000000000a50 30575 1726867611.27112: done sending task result for task 0affcac9-a3a5-e081-a588-000000000a50 30575 1726867611.27117: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: +++++ Success in test 'I can activate an existing profile' +++++ 30575 1726867611.27158: no more pending results, returning what we have 30575 1726867611.27162: results queue empty 30575 1726867611.27163: checking for any_errors_fatal 30575 1726867611.27170: done checking for any_errors_fatal 30575 1726867611.27170: checking for max_fail_percentage 30575 1726867611.27172: done checking for max_fail_percentage 30575 1726867611.27173: checking to see if all hosts have failed and the running result is not ok 30575 1726867611.27174: done checking to see if all hosts have failed 30575 1726867611.27174: getting the remaining hosts for this loop 30575 1726867611.27176: done getting the remaining hosts for this loop 30575 1726867611.27181: getting the next task for host managed_node3 30575 1726867611.27189: done getting next task for host managed_node3 30575 1726867611.27191: ^ task is: TASK: Cleanup 30575 1726867611.27194: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867611.27198: getting variables 30575 1726867611.27200: in VariableManager get_vars() 30575 1726867611.27233: Calling all_inventory to load vars for managed_node3 30575 1726867611.27235: Calling groups_inventory to load vars for managed_node3 30575 1726867611.27238: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867611.27247: Calling all_plugins_play to load vars for managed_node3 30575 1726867611.27250: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867611.27252: Calling groups_plugins_play to load vars for managed_node3 30575 1726867611.28043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867611.29532: done with get_vars() 30575 1726867611.29546: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 17:26:51 -0400 (0:00:00.040) 0:00:46.673 ****** 30575 1726867611.29610: entering _queue_task() for managed_node3/include_tasks 30575 1726867611.29856: worker is 1 (out of 1 available) 30575 1726867611.29867: exiting _queue_task() for managed_node3/include_tasks 30575 1726867611.30084: done queuing things up, now waiting for results queue to drain 30575 1726867611.30086: waiting for pending results... 
30575 1726867611.30299: running TaskExecutor() for managed_node3/TASK: Cleanup 30575 1726867611.30304: in run() - task 0affcac9-a3a5-e081-a588-000000000a54 30575 1726867611.30308: variable 'ansible_search_path' from source: unknown 30575 1726867611.30312: variable 'ansible_search_path' from source: unknown 30575 1726867611.30336: variable 'lsr_cleanup' from source: include params 30575 1726867611.30626: variable 'lsr_cleanup' from source: include params 30575 1726867611.30630: variable 'omit' from source: magic vars 30575 1726867611.30732: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.30741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.30756: variable 'omit' from source: magic vars 30575 1726867611.30987: variable 'ansible_distribution_major_version' from source: facts 30575 1726867611.30995: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867611.31001: variable 'item' from source: unknown 30575 1726867611.31063: variable 'item' from source: unknown 30575 1726867611.31096: variable 'item' from source: unknown 30575 1726867611.31156: variable 'item' from source: unknown 30575 1726867611.31394: dumping result to json 30575 1726867611.31397: done dumping result, returning 30575 1726867611.31399: done running TaskExecutor() for managed_node3/TASK: Cleanup [0affcac9-a3a5-e081-a588-000000000a54] 30575 1726867611.31401: sending task result for task 0affcac9-a3a5-e081-a588-000000000a54 30575 1726867611.31439: done sending task result for task 0affcac9-a3a5-e081-a588-000000000a54 30575 1726867611.31442: WORKER PROCESS EXITING 30575 1726867611.31520: no more pending results, returning what we have 30575 1726867611.31524: in VariableManager get_vars() 30575 1726867611.31555: Calling all_inventory to load vars for managed_node3 30575 1726867611.31557: Calling groups_inventory to load vars for managed_node3 30575 1726867611.31560: Calling 
all_plugins_inventory to load vars for managed_node3 30575 1726867611.31569: Calling all_plugins_play to load vars for managed_node3 30575 1726867611.31572: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867611.31575: Calling groups_plugins_play to load vars for managed_node3 30575 1726867611.32520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867611.33382: done with get_vars() 30575 1726867611.33394: variable 'ansible_search_path' from source: unknown 30575 1726867611.33395: variable 'ansible_search_path' from source: unknown 30575 1726867611.33420: we have included files to process 30575 1726867611.33420: generating all_blocks data 30575 1726867611.33422: done generating all_blocks data 30575 1726867611.33425: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867611.33426: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867611.33427: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867611.33550: done processing included file 30575 1726867611.33552: iterating over new_blocks loaded from include file 30575 1726867611.33553: in VariableManager get_vars() 30575 1726867611.33564: done with get_vars() 30575 1726867611.33565: filtering new block on tags 30575 1726867611.33596: done filtering new block on tags 30575 1726867611.33598: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node3 => (item=tasks/cleanup_profile+device.yml) 30575 1726867611.33606: extending task lists for all hosts with included blocks 
30575 1726867611.34731: done extending task lists 30575 1726867611.34732: done processing included files 30575 1726867611.34733: results queue empty 30575 1726867611.34734: checking for any_errors_fatal 30575 1726867611.34736: done checking for any_errors_fatal 30575 1726867611.34737: checking for max_fail_percentage 30575 1726867611.34738: done checking for max_fail_percentage 30575 1726867611.34739: checking to see if all hosts have failed and the running result is not ok 30575 1726867611.34740: done checking to see if all hosts have failed 30575 1726867611.34740: getting the remaining hosts for this loop 30575 1726867611.34742: done getting the remaining hosts for this loop 30575 1726867611.34744: getting the next task for host managed_node3 30575 1726867611.34748: done getting next task for host managed_node3 30575 1726867611.34750: ^ task is: TASK: Cleanup profile and device 30575 1726867611.34753: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867611.34755: getting variables 30575 1726867611.34756: in VariableManager get_vars() 30575 1726867611.34765: Calling all_inventory to load vars for managed_node3 30575 1726867611.34767: Calling groups_inventory to load vars for managed_node3 30575 1726867611.34770: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867611.34775: Calling all_plugins_play to load vars for managed_node3 30575 1726867611.34779: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867611.34782: Calling groups_plugins_play to load vars for managed_node3 30575 1726867611.35862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867611.37430: done with get_vars() 30575 1726867611.37448: done getting variables 30575 1726867611.37493: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 17:26:51 -0400 (0:00:00.079) 0:00:46.752 ****** 30575 1726867611.37523: entering _queue_task() for managed_node3/shell 30575 1726867611.37858: worker is 1 (out of 1 available) 30575 1726867611.37870: exiting _queue_task() for managed_node3/shell 30575 1726867611.37884: done queuing things up, now waiting for results queue to drain 30575 1726867611.37885: waiting for pending results... 
30575 1726867611.38298: running TaskExecutor() for managed_node3/TASK: Cleanup profile and device 30575 1726867611.38303: in run() - task 0affcac9-a3a5-e081-a588-000000000f6d 30575 1726867611.38314: variable 'ansible_search_path' from source: unknown 30575 1726867611.38321: variable 'ansible_search_path' from source: unknown 30575 1726867611.38359: calling self._execute() 30575 1726867611.38468: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.38483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.38503: variable 'omit' from source: magic vars 30575 1726867611.38871: variable 'ansible_distribution_major_version' from source: facts 30575 1726867611.38891: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867611.38902: variable 'omit' from source: magic vars 30575 1726867611.39156: variable 'omit' from source: magic vars 30575 1726867611.39310: variable 'interface' from source: play vars 30575 1726867611.39339: variable 'omit' from source: magic vars 30575 1726867611.39402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867611.39483: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867611.39487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867611.39507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867611.39530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867611.39571: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867611.39574: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.39579: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.39710: Set connection var ansible_pipelining to False 30575 1726867611.39713: Set connection var ansible_shell_type to sh 30575 1726867611.39718: Set connection var ansible_shell_executable to /bin/sh 30575 1726867611.39728: Set connection var ansible_timeout to 10 30575 1726867611.39733: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867611.39741: Set connection var ansible_connection to ssh 30575 1726867611.39788: variable 'ansible_shell_executable' from source: unknown 30575 1726867611.39792: variable 'ansible_connection' from source: unknown 30575 1726867611.39795: variable 'ansible_module_compression' from source: unknown 30575 1726867611.39798: variable 'ansible_shell_type' from source: unknown 30575 1726867611.39800: variable 'ansible_shell_executable' from source: unknown 30575 1726867611.39807: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.39810: variable 'ansible_pipelining' from source: unknown 30575 1726867611.39812: variable 'ansible_timeout' from source: unknown 30575 1726867611.39814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.40027: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867611.40036: variable 'omit' from source: magic vars 30575 1726867611.40042: starting attempt loop 30575 1726867611.40045: running the handler 30575 1726867611.40055: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867611.40136: _low_level_execute_command(): starting 30575 1726867611.40139: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867611.40904: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867611.41005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867611.41021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867611.41043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867611.41202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867611.42879: stdout chunk (state=3): >>>/root <<< 30575 1726867611.43153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867611.43157: stdout chunk (state=3): >>><<< 30575 1726867611.43159: stderr chunk (state=3): 
>>><<< 30575 1726867611.43162: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867611.43165: _low_level_execute_command(): starting 30575 1726867611.43167: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867611.4305985-32876-191115562034023 `" && echo ansible-tmp-1726867611.4305985-32876-191115562034023="` echo /root/.ansible/tmp/ansible-tmp-1726867611.4305985-32876-191115562034023 `" ) && sleep 0' 30575 1726867611.43894: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867611.43947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867611.45832: stdout chunk (state=3): >>>ansible-tmp-1726867611.4305985-32876-191115562034023=/root/.ansible/tmp/ansible-tmp-1726867611.4305985-32876-191115562034023 <<< 30575 1726867611.45963: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867611.45975: stdout chunk (state=3): >>><<< 30575 1726867611.45981: stderr chunk (state=3): >>><<< 30575 1726867611.45983: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867611.4305985-32876-191115562034023=/root/.ansible/tmp/ansible-tmp-1726867611.4305985-32876-191115562034023 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867611.46010: variable 'ansible_module_compression' from source: unknown 30575 1726867611.46051: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867611.46089: variable 'ansible_facts' from source: unknown 30575 1726867611.46144: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867611.4305985-32876-191115562034023/AnsiballZ_command.py 30575 1726867611.46242: Sending initial data 30575 1726867611.46246: Sent initial data (156 bytes) 30575 1726867611.46804: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867611.46828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867611.46903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867611.48430: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867611.48441: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867611.48473: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867611.48512: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpdkhjfj_n /root/.ansible/tmp/ansible-tmp-1726867611.4305985-32876-191115562034023/AnsiballZ_command.py <<< 30575 1726867611.48521: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867611.4305985-32876-191115562034023/AnsiballZ_command.py" <<< 30575 1726867611.48561: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpdkhjfj_n" to remote "/root/.ansible/tmp/ansible-tmp-1726867611.4305985-32876-191115562034023/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867611.4305985-32876-191115562034023/AnsiballZ_command.py" <<< 30575 1726867611.49096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867611.49138: stderr chunk (state=3): >>><<< 30575 1726867611.49142: stdout chunk (state=3): >>><<< 30575 1726867611.49167: done transferring module to remote 30575 1726867611.49170: _low_level_execute_command(): starting 30575 1726867611.49185: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867611.4305985-32876-191115562034023/ /root/.ansible/tmp/ansible-tmp-1726867611.4305985-32876-191115562034023/AnsiballZ_command.py && sleep 0' 30575 1726867611.49797: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867611.49841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867611.49845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867611.49911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867611.51642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867611.51646: stderr chunk (state=3): >>><<< 30575 1726867611.51649: stdout chunk (state=3): >>><<< 30575 1726867611.51663: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867611.51670: _low_level_execute_command(): starting 30575 1726867611.51672: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867611.4305985-32876-191115562034023/AnsiballZ_command.py && sleep 0' 30575 1726867611.52069: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867611.52073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867611.52075: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867611.52079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867611.52127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867611.52132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867611.52186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30575 1726867611.72758: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (ade586ae-171f-45bd-a4ea-cde3464255eb) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 17:26:51.670578", "end": "2024-09-20 17:26:51.724013", "delta": "0:00:00.053435", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867611.75386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867611.75391: stderr chunk (state=3): >>><<< 30575 1726867611.75393: stdout chunk (state=3): >>><<< 30575 1726867611.75429: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Connection 'statebr' (ade586ae-171f-45bd-a4ea-cde3464255eb) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 17:26:51.670578", "end": "2024-09-20 17:26:51.724013", "delta": "0:00:00.053435", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867611.75456: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867611.4305985-32876-191115562034023/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867611.75465: _low_level_execute_command(): starting 30575 1726867611.75482: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867611.4305985-32876-191115562034023/ > /dev/null 2>&1 && sleep 0' 30575 1726867611.76393: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867611.76397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867611.76399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867611.76402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867611.76511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867611.76599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867611.78429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867611.78451: stderr chunk (state=3): >>><<< 30575 1726867611.78458: stdout chunk (state=3): >>><<< 30575 1726867611.78476: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867611.78485: handler run complete 30575 1726867611.78519: Evaluated conditional (False): False 30575 1726867611.78523: attempt loop complete, returning result 30575 1726867611.78525: _execute() done 30575 1726867611.78528: dumping result to json 30575 1726867611.78591: done dumping result, returning 30575 1726867611.78594: done running TaskExecutor() for managed_node3/TASK: Cleanup profile and device [0affcac9-a3a5-e081-a588-000000000f6d] 30575 1726867611.78597: sending task result for task 0affcac9-a3a5-e081-a588-000000000f6d 30575 1726867611.78662: done sending task result for task 0affcac9-a3a5-e081-a588-000000000f6d 30575 1726867611.78664: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.053435", "end": "2024-09-20 17:26:51.724013", "rc": 0, "start": "2024-09-20 17:26:51.670578" } STDOUT: Connection 'statebr' (ade586ae-171f-45bd-a4ea-cde3464255eb) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' 30575 1726867611.78738: no more pending results, returning what we have 30575 1726867611.78742: results queue empty 30575 1726867611.78743: checking for any_errors_fatal 30575 1726867611.78744: done checking for any_errors_fatal 30575 1726867611.78745: checking for max_fail_percentage 30575 1726867611.78746: done checking for max_fail_percentage 30575 1726867611.78747: checking to see if all hosts have failed and the running result is not ok 30575 1726867611.78748: done checking to see if all hosts have failed 30575 1726867611.78749: getting the remaining hosts for this loop 30575 1726867611.78750: done getting the remaining hosts for this loop 30575 1726867611.78754: getting the next task for host managed_node3 30575 1726867611.78766: done getting next task for host managed_node3 30575 1726867611.78768: ^ task is: TASK: Include the task 'run_test.yml' 30575 1726867611.78770: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867611.78774: getting variables 30575 1726867611.78776: in VariableManager get_vars() 30575 1726867611.78811: Calling all_inventory to load vars for managed_node3 30575 1726867611.78814: Calling groups_inventory to load vars for managed_node3 30575 1726867611.78820: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867611.78830: Calling all_plugins_play to load vars for managed_node3 30575 1726867611.78832: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867611.78835: Calling groups_plugins_play to load vars for managed_node3 30575 1726867611.80787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867611.85454: done with get_vars() 30575 1726867611.85488: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:83 Friday 20 September 2024 17:26:51 -0400 (0:00:00.480) 0:00:47.233 ****** 30575 1726867611.85607: entering _queue_task() for managed_node3/include_tasks 30575 1726867611.86213: worker is 1 (out of 1 available) 30575 1726867611.86225: exiting _queue_task() for managed_node3/include_tasks 30575 1726867611.86239: done queuing things up, now waiting for results queue to drain 30575 1726867611.86240: waiting for pending results... 
30575 1726867611.86766: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' 30575 1726867611.86810: in run() - task 0affcac9-a3a5-e081-a588-000000000013 30575 1726867611.86825: variable 'ansible_search_path' from source: unknown 30575 1726867611.86855: calling self._execute() 30575 1726867611.86942: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.86945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.86955: variable 'omit' from source: magic vars 30575 1726867611.87305: variable 'ansible_distribution_major_version' from source: facts 30575 1726867611.87310: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867611.87313: _execute() done 30575 1726867611.87318: dumping result to json 30575 1726867611.87320: done dumping result, returning 30575 1726867611.87322: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [0affcac9-a3a5-e081-a588-000000000013] 30575 1726867611.87324: sending task result for task 0affcac9-a3a5-e081-a588-000000000013 30575 1726867611.87457: done sending task result for task 0affcac9-a3a5-e081-a588-000000000013 30575 1726867611.87460: WORKER PROCESS EXITING 30575 1726867611.87572: no more pending results, returning what we have 30575 1726867611.87580: in VariableManager get_vars() 30575 1726867611.87623: Calling all_inventory to load vars for managed_node3 30575 1726867611.87627: Calling groups_inventory to load vars for managed_node3 30575 1726867611.87631: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867611.87643: Calling all_plugins_play to load vars for managed_node3 30575 1726867611.87646: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867611.87650: Calling groups_plugins_play to load vars for managed_node3 30575 1726867611.89281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30575 1726867611.90248: done with get_vars() 30575 1726867611.90261: variable 'ansible_search_path' from source: unknown 30575 1726867611.90271: we have included files to process 30575 1726867611.90271: generating all_blocks data 30575 1726867611.90273: done generating all_blocks data 30575 1726867611.90276: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867611.90279: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867611.90281: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867611.90632: in VariableManager get_vars() 30575 1726867611.90648: done with get_vars() 30575 1726867611.90690: in VariableManager get_vars() 30575 1726867611.90708: done with get_vars() 30575 1726867611.90746: in VariableManager get_vars() 30575 1726867611.90763: done with get_vars() 30575 1726867611.90805: in VariableManager get_vars() 30575 1726867611.90821: done with get_vars() 30575 1726867611.90862: in VariableManager get_vars() 30575 1726867611.90880: done with get_vars() 30575 1726867611.91246: in VariableManager get_vars() 30575 1726867611.91264: done with get_vars() 30575 1726867611.91276: done processing included file 30575 1726867611.91279: iterating over new_blocks loaded from include file 30575 1726867611.91280: in VariableManager get_vars() 30575 1726867611.91291: done with get_vars() 30575 1726867611.91292: filtering new block on tags 30575 1726867611.91386: done filtering new block on tags 30575 1726867611.91390: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3 30575 1726867611.91395: extending task lists for all hosts with included 
blocks 30575 1726867611.91427: done extending task lists 30575 1726867611.91429: done processing included files 30575 1726867611.91429: results queue empty 30575 1726867611.91430: checking for any_errors_fatal 30575 1726867611.91433: done checking for any_errors_fatal 30575 1726867611.91434: checking for max_fail_percentage 30575 1726867611.91435: done checking for max_fail_percentage 30575 1726867611.91436: checking to see if all hosts have failed and the running result is not ok 30575 1726867611.91437: done checking to see if all hosts have failed 30575 1726867611.91438: getting the remaining hosts for this loop 30575 1726867611.91439: done getting the remaining hosts for this loop 30575 1726867611.91441: getting the next task for host managed_node3 30575 1726867611.91445: done getting next task for host managed_node3 30575 1726867611.91447: ^ task is: TASK: TEST: {{ lsr_description }} 30575 1726867611.91449: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867611.91452: getting variables 30575 1726867611.91452: in VariableManager get_vars() 30575 1726867611.91464: Calling all_inventory to load vars for managed_node3 30575 1726867611.91467: Calling groups_inventory to load vars for managed_node3 30575 1726867611.91469: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867611.91474: Calling all_plugins_play to load vars for managed_node3 30575 1726867611.91478: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867611.91481: Calling groups_plugins_play to load vars for managed_node3 30575 1726867611.92802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867611.93909: done with get_vars() 30575 1726867611.93928: done getting variables 30575 1726867611.93963: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867611.94083: variable 'lsr_description' from source: include params TASK [TEST: I can remove an existing profile without taking it down] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 17:26:51 -0400 (0:00:00.085) 0:00:47.318 ****** 30575 1726867611.94110: entering _queue_task() for managed_node3/debug 30575 1726867611.94410: worker is 1 (out of 1 available) 30575 1726867611.94422: exiting _queue_task() for managed_node3/debug 30575 1726867611.94435: done queuing things up, now waiting for results queue to drain 30575 1726867611.94437: waiting for pending results... 
30575 1726867611.95298: running TaskExecutor() for managed_node3/TASK: TEST: I can remove an existing profile without taking it down 30575 1726867611.95309: in run() - task 0affcac9-a3a5-e081-a588-000000001005 30575 1726867611.95313: variable 'ansible_search_path' from source: unknown 30575 1726867611.95316: variable 'ansible_search_path' from source: unknown 30575 1726867611.95320: calling self._execute() 30575 1726867611.95444: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.95448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.95459: variable 'omit' from source: magic vars 30575 1726867611.95990: variable 'ansible_distribution_major_version' from source: facts 30575 1726867611.95994: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867611.95998: variable 'omit' from source: magic vars 30575 1726867611.96000: variable 'omit' from source: magic vars 30575 1726867611.96035: variable 'lsr_description' from source: include params 30575 1726867611.96059: variable 'omit' from source: magic vars 30575 1726867611.96113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867611.96159: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867611.96186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867611.96209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867611.96237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867611.96272: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867611.96285: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867611.96293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.96403: Set connection var ansible_pipelining to False 30575 1726867611.96445: Set connection var ansible_shell_type to sh 30575 1726867611.96451: Set connection var ansible_shell_executable to /bin/sh 30575 1726867611.96455: Set connection var ansible_timeout to 10 30575 1726867611.96458: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867611.96460: Set connection var ansible_connection to ssh 30575 1726867611.96491: variable 'ansible_shell_executable' from source: unknown 30575 1726867611.96551: variable 'ansible_connection' from source: unknown 30575 1726867611.96554: variable 'ansible_module_compression' from source: unknown 30575 1726867611.96560: variable 'ansible_shell_type' from source: unknown 30575 1726867611.96563: variable 'ansible_shell_executable' from source: unknown 30575 1726867611.96565: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867611.96568: variable 'ansible_pipelining' from source: unknown 30575 1726867611.96570: variable 'ansible_timeout' from source: unknown 30575 1726867611.96572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867611.96717: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867611.96728: variable 'omit' from source: magic vars 30575 1726867611.96733: starting attempt loop 30575 1726867611.96736: running the handler 30575 1726867611.96780: handler run complete 30575 1726867611.96791: attempt loop complete, returning result 30575 1726867611.96794: _execute() done 30575 1726867611.96797: dumping result to json 30575 1726867611.96799: done dumping result, returning 
30575 1726867611.96808: done running TaskExecutor() for managed_node3/TASK: TEST: I can remove an existing profile without taking it down [0affcac9-a3a5-e081-a588-000000001005] 30575 1726867611.96814: sending task result for task 0affcac9-a3a5-e081-a588-000000001005 30575 1726867611.96897: done sending task result for task 0affcac9-a3a5-e081-a588-000000001005 30575 1726867611.96899: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ########## I can remove an existing profile without taking it down ########## 30575 1726867611.96946: no more pending results, returning what we have 30575 1726867611.96949: results queue empty 30575 1726867611.96950: checking for any_errors_fatal 30575 1726867611.96951: done checking for any_errors_fatal 30575 1726867611.96951: checking for max_fail_percentage 30575 1726867611.96953: done checking for max_fail_percentage 30575 1726867611.96954: checking to see if all hosts have failed and the running result is not ok 30575 1726867611.96955: done checking to see if all hosts have failed 30575 1726867611.96956: getting the remaining hosts for this loop 30575 1726867611.96957: done getting the remaining hosts for this loop 30575 1726867611.96960: getting the next task for host managed_node3 30575 1726867611.96968: done getting next task for host managed_node3 30575 1726867611.96970: ^ task is: TASK: Show item 30575 1726867611.96972: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867611.96976: getting variables 30575 1726867611.96979: in VariableManager get_vars() 30575 1726867611.97018: Calling all_inventory to load vars for managed_node3 30575 1726867611.97021: Calling groups_inventory to load vars for managed_node3 30575 1726867611.97024: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867611.97034: Calling all_plugins_play to load vars for managed_node3 30575 1726867611.97036: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867611.97039: Calling groups_plugins_play to load vars for managed_node3 30575 1726867611.98310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867611.99614: done with get_vars() 30575 1726867611.99630: done getting variables 30575 1726867611.99671: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 17:26:51 -0400 (0:00:00.055) 0:00:47.374 ****** 30575 1726867611.99693: entering _queue_task() for managed_node3/debug 30575 1726867611.99904: worker is 1 (out of 1 available) 30575 1726867611.99919: exiting _queue_task() for managed_node3/debug 30575 1726867611.99931: done queuing things up, now waiting for results queue to drain 30575 1726867611.99933: waiting for pending results... 
30575 1726867612.00107: running TaskExecutor() for managed_node3/TASK: Show item 30575 1726867612.00182: in run() - task 0affcac9-a3a5-e081-a588-000000001006 30575 1726867612.00195: variable 'ansible_search_path' from source: unknown 30575 1726867612.00198: variable 'ansible_search_path' from source: unknown 30575 1726867612.00240: variable 'omit' from source: magic vars 30575 1726867612.00344: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.00351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.00360: variable 'omit' from source: magic vars 30575 1726867612.00622: variable 'ansible_distribution_major_version' from source: facts 30575 1726867612.00629: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867612.00635: variable 'omit' from source: magic vars 30575 1726867612.00658: variable 'omit' from source: magic vars 30575 1726867612.00687: variable 'item' from source: unknown 30575 1726867612.00738: variable 'item' from source: unknown 30575 1726867612.00751: variable 'omit' from source: magic vars 30575 1726867612.00782: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867612.00813: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867612.00827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867612.00840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.00851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.00874: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867612.00879: variable 'ansible_host' from source: host vars for 'managed_node3' 
30575 1726867612.00881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.00950: Set connection var ansible_pipelining to False 30575 1726867612.00953: Set connection var ansible_shell_type to sh 30575 1726867612.00957: Set connection var ansible_shell_executable to /bin/sh 30575 1726867612.00963: Set connection var ansible_timeout to 10 30575 1726867612.00967: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867612.00974: Set connection var ansible_connection to ssh 30575 1726867612.00991: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.00994: variable 'ansible_connection' from source: unknown 30575 1726867612.00997: variable 'ansible_module_compression' from source: unknown 30575 1726867612.00999: variable 'ansible_shell_type' from source: unknown 30575 1726867612.01001: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.01003: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.01006: variable 'ansible_pipelining' from source: unknown 30575 1726867612.01008: variable 'ansible_timeout' from source: unknown 30575 1726867612.01013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.01110: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867612.01121: variable 'omit' from source: magic vars 30575 1726867612.01124: starting attempt loop 30575 1726867612.01127: running the handler 30575 1726867612.01164: variable 'lsr_description' from source: include params 30575 1726867612.01209: variable 'lsr_description' from source: include params 30575 1726867612.01219: handler run complete 30575 1726867612.01230: attempt loop 
complete, returning result 30575 1726867612.01246: variable 'item' from source: unknown 30575 1726867612.01290: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can remove an existing profile without taking it down" } 30575 1726867612.01431: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.01435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.01437: variable 'omit' from source: magic vars 30575 1726867612.01503: variable 'ansible_distribution_major_version' from source: facts 30575 1726867612.01507: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867612.01511: variable 'omit' from source: magic vars 30575 1726867612.01523: variable 'omit' from source: magic vars 30575 1726867612.01568: variable 'item' from source: unknown 30575 1726867612.01637: variable 'item' from source: unknown 30575 1726867612.01648: variable 'omit' from source: magic vars 30575 1726867612.01679: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867612.01682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.01686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.01720: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867612.01723: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.01725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.01788: Set connection var ansible_pipelining to False 30575 1726867612.01791: Set connection var 
ansible_shell_type to sh 30575 1726867612.01795: Set connection var ansible_shell_executable to /bin/sh 30575 1726867612.01800: Set connection var ansible_timeout to 10 30575 1726867612.01805: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867612.01814: Set connection var ansible_connection to ssh 30575 1726867612.01832: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.01836: variable 'ansible_connection' from source: unknown 30575 1726867612.01839: variable 'ansible_module_compression' from source: unknown 30575 1726867612.01843: variable 'ansible_shell_type' from source: unknown 30575 1726867612.01845: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.01847: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.01870: variable 'ansible_pipelining' from source: unknown 30575 1726867612.01888: variable 'ansible_timeout' from source: unknown 30575 1726867612.01892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.01988: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867612.01993: variable 'omit' from source: magic vars 30575 1726867612.01996: starting attempt loop 30575 1726867612.01998: running the handler 30575 1726867612.02058: variable 'lsr_setup' from source: include params 30575 1726867612.02111: variable 'lsr_setup' from source: include params 30575 1726867612.02138: handler run complete 30575 1726867612.02147: attempt loop complete, returning result 30575 1726867612.02167: variable 'item' from source: unknown 30575 1726867612.02281: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": 
"lsr_setup", "lsr_setup": [ "tasks/create_bridge_profile.yml", "tasks/activate_profile.yml" ] } 30575 1726867612.02370: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.02374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.02455: variable 'omit' from source: magic vars 30575 1726867612.02509: variable 'ansible_distribution_major_version' from source: facts 30575 1726867612.02512: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867612.02517: variable 'omit' from source: magic vars 30575 1726867612.02531: variable 'omit' from source: magic vars 30575 1726867612.02560: variable 'item' from source: unknown 30575 1726867612.02603: variable 'item' from source: unknown 30575 1726867612.02613: variable 'omit' from source: magic vars 30575 1726867612.02629: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867612.02637: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.02640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.02649: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867612.02652: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.02654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.02717: Set connection var ansible_pipelining to False 30575 1726867612.02720: Set connection var ansible_shell_type to sh 30575 1726867612.02735: Set connection var ansible_shell_executable to /bin/sh 30575 1726867612.02738: Set connection var ansible_timeout to 10 30575 1726867612.02760: Set connection var ansible_module_compression to 
ZIP_DEFLATED 30575 1726867612.02763: Set connection var ansible_connection to ssh 30575 1726867612.02795: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.02799: variable 'ansible_connection' from source: unknown 30575 1726867612.02805: variable 'ansible_module_compression' from source: unknown 30575 1726867612.02807: variable 'ansible_shell_type' from source: unknown 30575 1726867612.02809: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.02811: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.02813: variable 'ansible_pipelining' from source: unknown 30575 1726867612.02815: variable 'ansible_timeout' from source: unknown 30575 1726867612.02817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.02948: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867612.02951: variable 'omit' from source: magic vars 30575 1726867612.02953: starting attempt loop 30575 1726867612.02956: running the handler 30575 1726867612.02958: variable 'lsr_test' from source: include params 30575 1726867612.03016: variable 'lsr_test' from source: include params 30575 1726867612.03030: handler run complete 30575 1726867612.03059: attempt loop complete, returning result 30575 1726867612.03062: variable 'item' from source: unknown 30575 1726867612.03152: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove_profile.yml" ] } 30575 1726867612.03251: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.03254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 
30575 1726867612.03257: variable 'omit' from source: magic vars 30575 1726867612.03456: variable 'ansible_distribution_major_version' from source: facts 30575 1726867612.03460: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867612.03499: variable 'omit' from source: magic vars 30575 1726867612.03506: variable 'omit' from source: magic vars 30575 1726867612.03508: variable 'item' from source: unknown 30575 1726867612.03544: variable 'item' from source: unknown 30575 1726867612.03547: variable 'omit' from source: magic vars 30575 1726867612.03560: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867612.03569: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.03617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.03622: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867612.03625: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.03627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.03720: Set connection var ansible_pipelining to False 30575 1726867612.03724: Set connection var ansible_shell_type to sh 30575 1726867612.03727: Set connection var ansible_shell_executable to /bin/sh 30575 1726867612.03729: Set connection var ansible_timeout to 10 30575 1726867612.03731: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867612.03733: Set connection var ansible_connection to ssh 30575 1726867612.03803: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.03807: variable 'ansible_connection' from source: unknown 30575 1726867612.03809: variable 'ansible_module_compression' 
from source: unknown 30575 1726867612.03811: variable 'ansible_shell_type' from source: unknown 30575 1726867612.03814: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.03816: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.03853: variable 'ansible_pipelining' from source: unknown 30575 1726867612.03856: variable 'ansible_timeout' from source: unknown 30575 1726867612.03860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.03889: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867612.03892: variable 'omit' from source: magic vars 30575 1726867612.03914: starting attempt loop 30575 1726867612.03918: running the handler 30575 1726867612.03920: variable 'lsr_assert' from source: include params 30575 1726867612.04009: variable 'lsr_assert' from source: include params 30575 1726867612.04015: handler run complete 30575 1726867612.04069: attempt loop complete, returning result 30575 1726867612.04074: variable 'item' from source: unknown 30575 1726867612.04122: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_present.yml", "tasks/assert_profile_absent.yml" ] } 30575 1726867612.04221: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.04225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.04230: variable 'omit' from source: magic vars 30575 1726867612.04454: variable 'ansible_distribution_major_version' from source: facts 30575 1726867612.04460: Evaluated conditional (ansible_distribution_major_version != '6'): 
True 30575 1726867612.04463: variable 'omit' from source: magic vars 30575 1726867612.04530: variable 'omit' from source: magic vars 30575 1726867612.04533: variable 'item' from source: unknown 30575 1726867612.04574: variable 'item' from source: unknown 30575 1726867612.04608: variable 'omit' from source: magic vars 30575 1726867612.04612: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867612.04614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.04617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.04634: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867612.04639: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.04642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.04679: Set connection var ansible_pipelining to False 30575 1726867612.04683: Set connection var ansible_shell_type to sh 30575 1726867612.04687: Set connection var ansible_shell_executable to /bin/sh 30575 1726867612.04692: Set connection var ansible_timeout to 10 30575 1726867612.04696: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867612.04702: Set connection var ansible_connection to ssh 30575 1726867612.04744: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.04747: variable 'ansible_connection' from source: unknown 30575 1726867612.04749: variable 'ansible_module_compression' from source: unknown 30575 1726867612.04751: variable 'ansible_shell_type' from source: unknown 30575 1726867612.04754: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.04756: variable 'ansible_host' from source: host 
vars for 'managed_node3' 30575 1726867612.04758: variable 'ansible_pipelining' from source: unknown 30575 1726867612.04760: variable 'ansible_timeout' from source: unknown 30575 1726867612.04761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.04809: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867612.04814: variable 'omit' from source: magic vars 30575 1726867612.04817: starting attempt loop 30575 1726867612.04827: running the handler 30575 1726867612.04975: handler run complete 30575 1726867612.05003: attempt loop complete, returning result 30575 1726867612.05006: variable 'item' from source: unknown 30575 1726867612.05057: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 30575 1726867612.05155: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.05159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.05161: variable 'omit' from source: magic vars 30575 1726867612.05346: variable 'ansible_distribution_major_version' from source: facts 30575 1726867612.05349: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867612.05352: variable 'omit' from source: magic vars 30575 1726867612.05354: variable 'omit' from source: magic vars 30575 1726867612.05384: variable 'item' from source: unknown 30575 1726867612.05425: variable 'item' from source: unknown 30575 1726867612.05454: variable 'omit' from source: magic vars 30575 1726867612.05465: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867612.05472: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.05503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.05508: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867612.05510: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.05512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.05574: Set connection var ansible_pipelining to False 30575 1726867612.05595: Set connection var ansible_shell_type to sh 30575 1726867612.05598: Set connection var ansible_shell_executable to /bin/sh 30575 1726867612.05601: Set connection var ansible_timeout to 10 30575 1726867612.05603: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867612.05605: Set connection var ansible_connection to ssh 30575 1726867612.05616: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.05618: variable 'ansible_connection' from source: unknown 30575 1726867612.05623: variable 'ansible_module_compression' from source: unknown 30575 1726867612.05626: variable 'ansible_shell_type' from source: unknown 30575 1726867612.05628: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.05630: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.05634: variable 'ansible_pipelining' from source: unknown 30575 1726867612.05637: variable 'ansible_timeout' from source: unknown 30575 1726867612.05649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.05714: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867612.05722: variable 'omit' from source: magic vars 30575 1726867612.05725: starting attempt loop 30575 1726867612.05727: running the handler 30575 1726867612.05755: variable 'lsr_fail_debug' from source: play vars 30575 1726867612.05825: variable 'lsr_fail_debug' from source: play vars 30575 1726867612.05839: handler run complete 30575 1726867612.05851: attempt loop complete, returning result 30575 1726867612.05861: variable 'item' from source: unknown 30575 1726867612.05906: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30575 1726867612.06033: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.06036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.06039: variable 'omit' from source: magic vars 30575 1726867612.06125: variable 'ansible_distribution_major_version' from source: facts 30575 1726867612.06128: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867612.06132: variable 'omit' from source: magic vars 30575 1726867612.06144: variable 'omit' from source: magic vars 30575 1726867612.06169: variable 'item' from source: unknown 30575 1726867612.06217: variable 'item' from source: unknown 30575 1726867612.06237: variable 'omit' from source: magic vars 30575 1726867612.06266: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867612.06269: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.06274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.06279: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867612.06282: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.06284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.06345: Set connection var ansible_pipelining to False 30575 1726867612.06348: Set connection var ansible_shell_type to sh 30575 1726867612.06351: Set connection var ansible_shell_executable to /bin/sh 30575 1726867612.06356: Set connection var ansible_timeout to 10 30575 1726867612.06384: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867612.06401: Set connection var ansible_connection to ssh 30575 1726867612.06404: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.06407: variable 'ansible_connection' from source: unknown 30575 1726867612.06409: variable 'ansible_module_compression' from source: unknown 30575 1726867612.06411: variable 'ansible_shell_type' from source: unknown 30575 1726867612.06413: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.06415: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.06417: variable 'ansible_pipelining' from source: unknown 30575 1726867612.06454: variable 'ansible_timeout' from source: unknown 30575 1726867612.06457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.06495: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867612.06502: variable 'omit' from source: magic vars 30575 1726867612.06504: starting attempt loop 30575 1726867612.06507: running the handler 30575 1726867612.06524: variable 'lsr_cleanup' from source: include params 30575 1726867612.06568: variable 'lsr_cleanup' from source: include params 30575 1726867612.06608: handler run complete 30575 1726867612.06611: attempt loop complete, returning result 30575 1726867612.06615: variable 'item' from source: unknown 30575 1726867612.06698: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30575 1726867612.06776: dumping result to json 30575 1726867612.06793: done dumping result, returning 30575 1726867612.06795: done running TaskExecutor() for managed_node3/TASK: Show item [0affcac9-a3a5-e081-a588-000000001006] 30575 1726867612.06797: sending task result for task 0affcac9-a3a5-e081-a588-000000001006 30575 1726867612.06835: done sending task result for task 0affcac9-a3a5-e081-a588-000000001006 30575 1726867612.06838: WORKER PROCESS EXITING 30575 1726867612.06940: no more pending results, returning what we have 30575 1726867612.06944: results queue empty 30575 1726867612.06945: checking for any_errors_fatal 30575 1726867612.06952: done checking for any_errors_fatal 30575 1726867612.06953: checking for max_fail_percentage 30575 1726867612.06954: done checking for max_fail_percentage 30575 1726867612.06955: checking to see if all hosts have failed and the running result is not ok 30575 1726867612.06956: done checking to see if all hosts have failed 30575 1726867612.06956: getting the remaining hosts for this loop 30575 1726867612.06958: done getting the remaining hosts for this loop 30575 
1726867612.06961: getting the next task for host managed_node3 30575 1726867612.06967: done getting next task for host managed_node3 30575 1726867612.06969: ^ task is: TASK: Include the task 'show_interfaces.yml' 30575 1726867612.06972: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867612.06975: getting variables 30575 1726867612.06976: in VariableManager get_vars() 30575 1726867612.07015: Calling all_inventory to load vars for managed_node3 30575 1726867612.07017: Calling groups_inventory to load vars for managed_node3 30575 1726867612.07020: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867612.07030: Calling all_plugins_play to load vars for managed_node3 30575 1726867612.07032: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867612.07034: Calling groups_plugins_play to load vars for managed_node3 30575 1726867612.07929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867612.08801: done with get_vars() 30575 1726867612.08816: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 17:26:52 -0400 (0:00:00.091) 0:00:47.466 ****** 30575 1726867612.08882: entering _queue_task() for managed_node3/include_tasks 30575 
1726867612.09115: worker is 1 (out of 1 available) 30575 1726867612.09129: exiting _queue_task() for managed_node3/include_tasks 30575 1726867612.09142: done queuing things up, now waiting for results queue to drain 30575 1726867612.09143: waiting for pending results... 30575 1726867612.09382: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 30575 1726867612.09530: in run() - task 0affcac9-a3a5-e081-a588-000000001007 30575 1726867612.09534: variable 'ansible_search_path' from source: unknown 30575 1726867612.09539: variable 'ansible_search_path' from source: unknown 30575 1726867612.09565: calling self._execute() 30575 1726867612.09651: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.09686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.09692: variable 'omit' from source: magic vars 30575 1726867612.10093: variable 'ansible_distribution_major_version' from source: facts 30575 1726867612.10097: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867612.10105: _execute() done 30575 1726867612.10108: dumping result to json 30575 1726867612.10111: done dumping result, returning 30575 1726867612.10113: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affcac9-a3a5-e081-a588-000000001007] 30575 1726867612.10115: sending task result for task 0affcac9-a3a5-e081-a588-000000001007 30575 1726867612.10254: done sending task result for task 0affcac9-a3a5-e081-a588-000000001007 30575 1726867612.10257: WORKER PROCESS EXITING 30575 1726867612.10321: no more pending results, returning what we have 30575 1726867612.10327: in VariableManager get_vars() 30575 1726867612.10359: Calling all_inventory to load vars for managed_node3 30575 1726867612.10363: Calling groups_inventory to load vars for managed_node3 30575 1726867612.10368: Calling all_plugins_inventory to load vars for managed_node3 
30575 1726867612.10426: Calling all_plugins_play to load vars for managed_node3 30575 1726867612.10430: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867612.10433: Calling groups_plugins_play to load vars for managed_node3 30575 1726867612.11522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867612.12542: done with get_vars() 30575 1726867612.12556: variable 'ansible_search_path' from source: unknown 30575 1726867612.12557: variable 'ansible_search_path' from source: unknown 30575 1726867612.12584: we have included files to process 30575 1726867612.12584: generating all_blocks data 30575 1726867612.12586: done generating all_blocks data 30575 1726867612.12590: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30575 1726867612.12590: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30575 1726867612.12592: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30575 1726867612.12659: in VariableManager get_vars() 30575 1726867612.12672: done with get_vars() 30575 1726867612.12763: done processing included file 30575 1726867612.12764: iterating over new_blocks loaded from include file 30575 1726867612.12765: in VariableManager get_vars() 30575 1726867612.12775: done with get_vars() 30575 1726867612.12776: filtering new block on tags 30575 1726867612.12809: done filtering new block on tags 30575 1726867612.12811: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 30575 1726867612.12815: extending task lists for all hosts with included blocks 30575 1726867612.13085: 
done extending task lists 30575 1726867612.13086: done processing included files 30575 1726867612.13086: results queue empty 30575 1726867612.13087: checking for any_errors_fatal 30575 1726867612.13091: done checking for any_errors_fatal 30575 1726867612.13091: checking for max_fail_percentage 30575 1726867612.13092: done checking for max_fail_percentage 30575 1726867612.13093: checking to see if all hosts have failed and the running result is not ok 30575 1726867612.13094: done checking to see if all hosts have failed 30575 1726867612.13095: getting the remaining hosts for this loop 30575 1726867612.13096: done getting the remaining hosts for this loop 30575 1726867612.13097: getting the next task for host managed_node3 30575 1726867612.13100: done getting next task for host managed_node3 30575 1726867612.13101: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30575 1726867612.13103: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867612.13105: getting variables 30575 1726867612.13105: in VariableManager get_vars() 30575 1726867612.13112: Calling all_inventory to load vars for managed_node3 30575 1726867612.13113: Calling groups_inventory to load vars for managed_node3 30575 1726867612.13115: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867612.13120: Calling all_plugins_play to load vars for managed_node3 30575 1726867612.13122: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867612.13123: Calling groups_plugins_play to load vars for managed_node3 30575 1726867612.14023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867612.14894: done with get_vars() 30575 1726867612.14908: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 17:26:52 -0400 (0:00:00.060) 0:00:47.527 ****** 30575 1726867612.14973: entering _queue_task() for managed_node3/include_tasks 30575 1726867612.15242: worker is 1 (out of 1 available) 30575 1726867612.15254: exiting _queue_task() for managed_node3/include_tasks 30575 1726867612.15272: done queuing things up, now waiting for results queue to drain 30575 1726867612.15274: waiting for pending results... 
30575 1726867612.15686: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 30575 1726867612.15692: in run() - task 0affcac9-a3a5-e081-a588-00000000102e 30575 1726867612.15694: variable 'ansible_search_path' from source: unknown 30575 1726867612.15697: variable 'ansible_search_path' from source: unknown 30575 1726867612.15814: calling self._execute() 30575 1726867612.15905: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.15999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.16005: variable 'omit' from source: magic vars 30575 1726867612.16656: variable 'ansible_distribution_major_version' from source: facts 30575 1726867612.16664: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867612.16674: _execute() done 30575 1726867612.16702: dumping result to json 30575 1726867612.16705: done dumping result, returning 30575 1726867612.16708: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affcac9-a3a5-e081-a588-00000000102e] 30575 1726867612.16709: sending task result for task 0affcac9-a3a5-e081-a588-00000000102e 30575 1726867612.16818: done sending task result for task 0affcac9-a3a5-e081-a588-00000000102e 30575 1726867612.16826: WORKER PROCESS EXITING 30575 1726867612.16888: no more pending results, returning what we have 30575 1726867612.16893: in VariableManager get_vars() 30575 1726867612.16926: Calling all_inventory to load vars for managed_node3 30575 1726867612.16928: Calling groups_inventory to load vars for managed_node3 30575 1726867612.16932: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867612.16946: Calling all_plugins_play to load vars for managed_node3 30575 1726867612.16950: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867612.16953: Calling groups_plugins_play to load vars for managed_node3 30575 
1726867612.17898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867612.19054: done with get_vars() 30575 1726867612.19072: variable 'ansible_search_path' from source: unknown 30575 1726867612.19073: variable 'ansible_search_path' from source: unknown 30575 1726867612.19108: we have included files to process 30575 1726867612.19110: generating all_blocks data 30575 1726867612.19111: done generating all_blocks data 30575 1726867612.19113: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867612.19113: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867612.19118: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867612.19372: done processing included file 30575 1726867612.19374: iterating over new_blocks loaded from include file 30575 1726867612.19375: in VariableManager get_vars() 30575 1726867612.19392: done with get_vars() 30575 1726867612.19393: filtering new block on tags 30575 1726867612.19433: done filtering new block on tags 30575 1726867612.19435: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 30575 1726867612.19440: extending task lists for all hosts with included blocks 30575 1726867612.19601: done extending task lists 30575 1726867612.19603: done processing included files 30575 1726867612.19603: results queue empty 30575 1726867612.19604: checking for any_errors_fatal 30575 1726867612.19607: done checking for any_errors_fatal 30575 1726867612.19608: checking for max_fail_percentage 30575 1726867612.19609: done 
checking for max_fail_percentage 30575 1726867612.19609: checking to see if all hosts have failed and the running result is not ok 30575 1726867612.19610: done checking to see if all hosts have failed 30575 1726867612.19611: getting the remaining hosts for this loop 30575 1726867612.19612: done getting the remaining hosts for this loop 30575 1726867612.19615: getting the next task for host managed_node3 30575 1726867612.19622: done getting next task for host managed_node3 30575 1726867612.19624: ^ task is: TASK: Gather current interface info 30575 1726867612.19627: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867612.19630: getting variables 30575 1726867612.19631: in VariableManager get_vars() 30575 1726867612.19640: Calling all_inventory to load vars for managed_node3 30575 1726867612.19642: Calling groups_inventory to load vars for managed_node3 30575 1726867612.19644: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867612.19649: Calling all_plugins_play to load vars for managed_node3 30575 1726867612.19652: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867612.19655: Calling groups_plugins_play to load vars for managed_node3 30575 1726867612.20828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867612.21667: done with get_vars() 30575 1726867612.21683: done getting variables 30575 1726867612.21712: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 17:26:52 -0400 (0:00:00.067) 0:00:47.594 ****** 30575 1726867612.21736: entering _queue_task() for managed_node3/command 30575 1726867612.21972: worker is 1 (out of 1 available) 30575 1726867612.21987: exiting _queue_task() for managed_node3/command 30575 1726867612.22001: done queuing things up, now waiting for results queue to drain 30575 1726867612.22003: waiting for pending results... 
30575 1726867612.22186: running TaskExecutor() for managed_node3/TASK: Gather current interface info 30575 1726867612.22261: in run() - task 0affcac9-a3a5-e081-a588-000000001069 30575 1726867612.22274: variable 'ansible_search_path' from source: unknown 30575 1726867612.22279: variable 'ansible_search_path' from source: unknown 30575 1726867612.22306: calling self._execute() 30575 1726867612.22395: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.22400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.22403: variable 'omit' from source: magic vars 30575 1726867612.22888: variable 'ansible_distribution_major_version' from source: facts 30575 1726867612.22892: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867612.22895: variable 'omit' from source: magic vars 30575 1726867612.22897: variable 'omit' from source: magic vars 30575 1726867612.22903: variable 'omit' from source: magic vars 30575 1726867612.22948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867612.22990: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867612.23023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867612.23043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.23058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.23094: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867612.23222: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.23226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867612.23243: Set connection var ansible_pipelining to False 30575 1726867612.23259: Set connection var ansible_shell_type to sh 30575 1726867612.23276: Set connection var ansible_shell_executable to /bin/sh 30575 1726867612.23286: Set connection var ansible_timeout to 10 30575 1726867612.23293: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867612.23333: Set connection var ansible_connection to ssh 30575 1726867612.23340: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.23343: variable 'ansible_connection' from source: unknown 30575 1726867612.23346: variable 'ansible_module_compression' from source: unknown 30575 1726867612.23349: variable 'ansible_shell_type' from source: unknown 30575 1726867612.23351: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.23353: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.23358: variable 'ansible_pipelining' from source: unknown 30575 1726867612.23360: variable 'ansible_timeout' from source: unknown 30575 1726867612.23363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.23472: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867612.23482: variable 'omit' from source: magic vars 30575 1726867612.23488: starting attempt loop 30575 1726867612.23490: running the handler 30575 1726867612.23503: _low_level_execute_command(): starting 30575 1726867612.23509: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867612.23981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30575 1726867612.24010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867612.24015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867612.24057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867612.24072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867612.24129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867612.25831: stdout chunk (state=3): >>>/root <<< 30575 1726867612.25976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867612.25982: stdout chunk (state=3): >>><<< 30575 1726867612.25984: stderr chunk (state=3): >>><<< 30575 1726867612.26004: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867612.26025: _low_level_execute_command(): starting 30575 1726867612.26082: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867612.2601202-32928-83025219140207 `" && echo ansible-tmp-1726867612.2601202-32928-83025219140207="` echo /root/.ansible/tmp/ansible-tmp-1726867612.2601202-32928-83025219140207 `" ) && sleep 0' 30575 1726867612.26493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867612.26496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867612.26500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867612.26509: stderr chunk (state=3): >>>debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867612.26511: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867612.26553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867612.26560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867612.26605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867612.28493: stdout chunk (state=3): >>>ansible-tmp-1726867612.2601202-32928-83025219140207=/root/.ansible/tmp/ansible-tmp-1726867612.2601202-32928-83025219140207 <<< 30575 1726867612.28642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867612.28646: stdout chunk (state=3): >>><<< 30575 1726867612.28649: stderr chunk (state=3): >>><<< 30575 1726867612.28883: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867612.2601202-32928-83025219140207=/root/.ansible/tmp/ansible-tmp-1726867612.2601202-32928-83025219140207 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867612.28887: variable 'ansible_module_compression' from source: unknown 30575 1726867612.28890: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867612.28892: variable 'ansible_facts' from source: unknown 30575 1726867612.28894: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867612.2601202-32928-83025219140207/AnsiballZ_command.py 30575 1726867612.29072: Sending initial data 30575 1726867612.29085: Sent initial data (155 bytes) 30575 1726867612.29532: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867612.29548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867612.29584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867612.31132: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867612.31203: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867612.31300: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp9yswjulr /root/.ansible/tmp/ansible-tmp-1726867612.2601202-32928-83025219140207/AnsiballZ_command.py <<< 30575 1726867612.31304: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867612.2601202-32928-83025219140207/AnsiballZ_command.py" <<< 30575 1726867612.31339: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp9yswjulr" to remote "/root/.ansible/tmp/ansible-tmp-1726867612.2601202-32928-83025219140207/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867612.2601202-32928-83025219140207/AnsiballZ_command.py" <<< 30575 1726867612.31885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867612.31920: stderr chunk (state=3): >>><<< 30575 1726867612.31923: stdout chunk (state=3): >>><<< 30575 1726867612.31944: done transferring module to remote 30575 1726867612.31957: _low_level_execute_command(): starting 30575 1726867612.31960: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867612.2601202-32928-83025219140207/ /root/.ansible/tmp/ansible-tmp-1726867612.2601202-32928-83025219140207/AnsiballZ_command.py && sleep 0' 30575 1726867612.32357: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867612.32360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867612.32362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 
1726867612.32365: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867612.32366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867612.32369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867612.32416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867612.32423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867612.32472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867612.34206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867612.34232: stderr chunk (state=3): >>><<< 30575 1726867612.34236: stdout chunk (state=3): >>><<< 30575 1726867612.34246: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867612.34249: _low_level_execute_command(): starting 30575 1726867612.34253: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867612.2601202-32928-83025219140207/AnsiballZ_command.py && sleep 0' 30575 1726867612.34623: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867612.34627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867612.34639: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867612.34688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK 
<<< 30575 1726867612.34702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867612.34749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867612.50268: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:26:52.497200", "end": "2024-09-20 17:26:52.500460", "delta": "0:00:00.003260", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867612.51905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867612.51909: stdout chunk (state=3): >>><<< 30575 1726867612.51912: stderr chunk (state=3): >>><<< 30575 1726867612.51995: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:26:52.497200", "end": "2024-09-20 17:26:52.500460", "delta": "0:00:00.003260", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867612.52000: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867612.2601202-32928-83025219140207/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867612.52002: _low_level_execute_command(): starting 30575 1726867612.52005: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867612.2601202-32928-83025219140207/ > /dev/null 2>&1 && sleep 0' 30575 1726867612.52599: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867612.52665: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867612.52669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867612.52692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867612.52725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867612.52779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867612.52786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867612.52828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867612.54646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867612.54674: stderr chunk (state=3): >>><<< 30575 1726867612.54676: stdout chunk (state=3): >>><<< 30575 1726867612.54691: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867612.54697: handler run complete 30575 1726867612.54714: Evaluated conditional (False): False 30575 1726867612.54724: attempt loop complete, returning result 30575 1726867612.54726: _execute() done 30575 1726867612.54729: dumping result to json 30575 1726867612.54736: done dumping result, returning 30575 1726867612.54744: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affcac9-a3a5-e081-a588-000000001069] 30575 1726867612.54748: sending task result for task 0affcac9-a3a5-e081-a588-000000001069 30575 1726867612.54846: done sending task result for task 0affcac9-a3a5-e081-a588-000000001069 30575 1726867612.54849: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003260", "end": "2024-09-20 17:26:52.500460", "rc": 0, "start": "2024-09-20 17:26:52.497200" } STDOUT: bonding_masters eth0 lo 30575 1726867612.54925: no more pending results, returning what we have 30575 1726867612.54930: results queue empty 30575 1726867612.54930: checking for any_errors_fatal 30575 1726867612.54932: done checking 
for any_errors_fatal 30575 1726867612.54933: checking for max_fail_percentage 30575 1726867612.54934: done checking for max_fail_percentage 30575 1726867612.54935: checking to see if all hosts have failed and the running result is not ok 30575 1726867612.54936: done checking to see if all hosts have failed 30575 1726867612.54937: getting the remaining hosts for this loop 30575 1726867612.54938: done getting the remaining hosts for this loop 30575 1726867612.54942: getting the next task for host managed_node3 30575 1726867612.54951: done getting next task for host managed_node3 30575 1726867612.54954: ^ task is: TASK: Set current_interfaces 30575 1726867612.54960: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867612.54965: getting variables 30575 1726867612.54967: in VariableManager get_vars() 30575 1726867612.55003: Calling all_inventory to load vars for managed_node3 30575 1726867612.55006: Calling groups_inventory to load vars for managed_node3 30575 1726867612.55009: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867612.55022: Calling all_plugins_play to load vars for managed_node3 30575 1726867612.55024: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867612.55027: Calling groups_plugins_play to load vars for managed_node3 30575 1726867612.55934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867612.57273: done with get_vars() 30575 1726867612.57293: done getting variables 30575 1726867612.57337: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 17:26:52 -0400 (0:00:00.356) 0:00:47.951 ****** 30575 1726867612.57360: entering _queue_task() for managed_node3/set_fact 30575 1726867612.57590: worker is 1 (out of 1 available) 30575 1726867612.57604: exiting _queue_task() for managed_node3/set_fact 30575 1726867612.57618: done queuing things up, now waiting for results queue to drain 30575 1726867612.57620: waiting for pending results... 
30575 1726867612.57803: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 30575 1726867612.57888: in run() - task 0affcac9-a3a5-e081-a588-00000000106a 30575 1726867612.57899: variable 'ansible_search_path' from source: unknown 30575 1726867612.57903: variable 'ansible_search_path' from source: unknown 30575 1726867612.57937: calling self._execute() 30575 1726867612.58006: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.58011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.58020: variable 'omit' from source: magic vars 30575 1726867612.58295: variable 'ansible_distribution_major_version' from source: facts 30575 1726867612.58305: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867612.58311: variable 'omit' from source: magic vars 30575 1726867612.58349: variable 'omit' from source: magic vars 30575 1726867612.58428: variable '_current_interfaces' from source: set_fact 30575 1726867612.58476: variable 'omit' from source: magic vars 30575 1726867612.58511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867612.58540: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867612.58557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867612.58571: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.58582: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.58611: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867612.58614: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.58616: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.58701: Set connection var ansible_pipelining to False 30575 1726867612.58706: Set connection var ansible_shell_type to sh 30575 1726867612.58709: Set connection var ansible_shell_executable to /bin/sh 30575 1726867612.58782: Set connection var ansible_timeout to 10 30575 1726867612.58785: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867612.58787: Set connection var ansible_connection to ssh 30575 1726867612.58789: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.58792: variable 'ansible_connection' from source: unknown 30575 1726867612.58794: variable 'ansible_module_compression' from source: unknown 30575 1726867612.58796: variable 'ansible_shell_type' from source: unknown 30575 1726867612.58798: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.58800: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.58802: variable 'ansible_pipelining' from source: unknown 30575 1726867612.58804: variable 'ansible_timeout' from source: unknown 30575 1726867612.58805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.58944: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867612.58964: variable 'omit' from source: magic vars 30575 1726867612.58975: starting attempt loop 30575 1726867612.58984: running the handler 30575 1726867612.58998: handler run complete 30575 1726867612.59010: attempt loop complete, returning result 30575 1726867612.59018: _execute() done 30575 1726867612.59024: dumping result to json 30575 1726867612.59183: done dumping result, returning 30575 
1726867612.59186: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affcac9-a3a5-e081-a588-00000000106a] 30575 1726867612.59188: sending task result for task 0affcac9-a3a5-e081-a588-00000000106a 30575 1726867612.59249: done sending task result for task 0affcac9-a3a5-e081-a588-00000000106a 30575 1726867612.59253: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 30575 1726867612.59316: no more pending results, returning what we have 30575 1726867612.59320: results queue empty 30575 1726867612.59321: checking for any_errors_fatal 30575 1726867612.59332: done checking for any_errors_fatal 30575 1726867612.59333: checking for max_fail_percentage 30575 1726867612.59334: done checking for max_fail_percentage 30575 1726867612.59335: checking to see if all hosts have failed and the running result is not ok 30575 1726867612.59336: done checking to see if all hosts have failed 30575 1726867612.59337: getting the remaining hosts for this loop 30575 1726867612.59338: done getting the remaining hosts for this loop 30575 1726867612.59343: getting the next task for host managed_node3 30575 1726867612.59353: done getting next task for host managed_node3 30575 1726867612.59356: ^ task is: TASK: Show current_interfaces 30575 1726867612.59360: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867612.59366: getting variables 30575 1726867612.59367: in VariableManager get_vars() 30575 1726867612.59406: Calling all_inventory to load vars for managed_node3 30575 1726867612.59409: Calling groups_inventory to load vars for managed_node3 30575 1726867612.59413: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867612.59425: Calling all_plugins_play to load vars for managed_node3 30575 1726867612.59428: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867612.59432: Calling groups_plugins_play to load vars for managed_node3 30575 1726867612.60539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867612.61380: done with get_vars() 30575 1726867612.61396: done getting variables 30575 1726867612.61438: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 17:26:52 -0400 (0:00:00.040) 0:00:47.992 ****** 30575 1726867612.61461: entering _queue_task() for managed_node3/debug 30575 1726867612.61673: worker is 1 (out of 1 available) 30575 1726867612.61689: exiting _queue_task() for managed_node3/debug 30575 1726867612.61702: done queuing things up, now waiting for results queue to drain 30575 1726867612.61704: waiting for pending results... 
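The Set current_interfaces result above (ansible_facts.current_interfaces populated from the `_current_interfaces` variable) is consistent with a set_fact task of roughly this shape. This is a sketch: only the variable names `current_interfaces` and `_current_interfaces` appear in the log, not the task source itself.

```yaml
# Assumed reconstruction of the task being executed; the log confirms the
# fact name and the source variable, not this exact expression.
- name: Set current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ _current_interfaces }}"
```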
30575 1726867612.61949: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 30575 1726867612.62067: in run() - task 0affcac9-a3a5-e081-a588-00000000102f 30575 1726867612.62090: variable 'ansible_search_path' from source: unknown 30575 1726867612.62100: variable 'ansible_search_path' from source: unknown 30575 1726867612.62150: calling self._execute() 30575 1726867612.62244: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.62444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.62450: variable 'omit' from source: magic vars 30575 1726867612.62672: variable 'ansible_distribution_major_version' from source: facts 30575 1726867612.62676: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867612.62681: variable 'omit' from source: magic vars 30575 1726867612.62731: variable 'omit' from source: magic vars 30575 1726867612.62836: variable 'current_interfaces' from source: set_fact 30575 1726867612.62840: variable 'omit' from source: magic vars 30575 1726867612.62875: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867612.62914: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867612.62932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867612.62950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.62983: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867612.62991: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867612.63039: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.63043: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.63101: Set connection var ansible_pipelining to False 30575 1726867612.63104: Set connection var ansible_shell_type to sh 30575 1726867612.63110: Set connection var ansible_shell_executable to /bin/sh 30575 1726867612.63116: Set connection var ansible_timeout to 10 30575 1726867612.63124: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867612.63131: Set connection var ansible_connection to ssh 30575 1726867612.63160: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.63164: variable 'ansible_connection' from source: unknown 30575 1726867612.63166: variable 'ansible_module_compression' from source: unknown 30575 1726867612.63170: variable 'ansible_shell_type' from source: unknown 30575 1726867612.63172: variable 'ansible_shell_executable' from source: unknown 30575 1726867612.63175: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.63178: variable 'ansible_pipelining' from source: unknown 30575 1726867612.63180: variable 'ansible_timeout' from source: unknown 30575 1726867612.63182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.63367: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867612.63372: variable 'omit' from source: magic vars 30575 1726867612.63375: starting attempt loop 30575 1726867612.63379: running the handler 30575 1726867612.63382: handler run complete 30575 1726867612.63583: attempt loop complete, returning result 30575 1726867612.63587: _execute() done 30575 1726867612.63589: dumping result to json 30575 1726867612.63592: done dumping result, returning 30575 1726867612.63595: done 
running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affcac9-a3a5-e081-a588-00000000102f] 30575 1726867612.63597: sending task result for task 0affcac9-a3a5-e081-a588-00000000102f 30575 1726867612.63659: done sending task result for task 0affcac9-a3a5-e081-a588-00000000102f 30575 1726867612.63663: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 30575 1726867612.63711: no more pending results, returning what we have 30575 1726867612.63714: results queue empty 30575 1726867612.63715: checking for any_errors_fatal 30575 1726867612.63723: done checking for any_errors_fatal 30575 1726867612.63724: checking for max_fail_percentage 30575 1726867612.63726: done checking for max_fail_percentage 30575 1726867612.63727: checking to see if all hosts have failed and the running result is not ok 30575 1726867612.63727: done checking to see if all hosts have failed 30575 1726867612.63728: getting the remaining hosts for this loop 30575 1726867612.63730: done getting the remaining hosts for this loop 30575 1726867612.63734: getting the next task for host managed_node3 30575 1726867612.63743: done getting next task for host managed_node3 30575 1726867612.63746: ^ task is: TASK: Setup 30575 1726867612.63749: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867612.63753: getting variables 30575 1726867612.63755: in VariableManager get_vars() 30575 1726867612.63793: Calling all_inventory to load vars for managed_node3 30575 1726867612.63800: Calling groups_inventory to load vars for managed_node3 30575 1726867612.63804: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867612.63815: Calling all_plugins_play to load vars for managed_node3 30575 1726867612.63825: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867612.63829: Calling groups_plugins_play to load vars for managed_node3 30575 1726867612.68721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867612.69565: done with get_vars() 30575 1726867612.69586: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 17:26:52 -0400 (0:00:00.081) 0:00:48.073 ****** 30575 1726867612.69642: entering _queue_task() for managed_node3/include_tasks 30575 1726867612.69914: worker is 1 (out of 1 available) 30575 1726867612.69931: exiting _queue_task() for managed_node3/include_tasks 30575 1726867612.69944: done queuing things up, now waiting for results queue to drain 30575 1726867612.69946: waiting for pending results... 
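The Show current_interfaces task queued above sits at show_interfaces.yml:5 and, per the MSG line in its result, prints the fact set by the preceding task. A minimal sketch, assuming the message is built directly from the fact:

```yaml
# Hypothetical form; the log's MSG output is
# "current_interfaces: ['bonding_masters', 'eth0', 'lo']".
- name: Show current_interfaces
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```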
30575 1726867612.70129: running TaskExecutor() for managed_node3/TASK: Setup 30575 1726867612.70210: in run() - task 0affcac9-a3a5-e081-a588-000000001008 30575 1726867612.70223: variable 'ansible_search_path' from source: unknown 30575 1726867612.70227: variable 'ansible_search_path' from source: unknown 30575 1726867612.70263: variable 'lsr_setup' from source: include params 30575 1726867612.70430: variable 'lsr_setup' from source: include params 30575 1726867612.70486: variable 'omit' from source: magic vars 30575 1726867612.70584: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.70592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.70601: variable 'omit' from source: magic vars 30575 1726867612.70772: variable 'ansible_distribution_major_version' from source: facts 30575 1726867612.70780: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867612.70787: variable 'item' from source: unknown 30575 1726867612.70835: variable 'item' from source: unknown 30575 1726867612.70858: variable 'item' from source: unknown 30575 1726867612.70908: variable 'item' from source: unknown 30575 1726867612.71035: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.71038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.71040: variable 'omit' from source: magic vars 30575 1726867612.71117: variable 'ansible_distribution_major_version' from source: facts 30575 1726867612.71124: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867612.71129: variable 'item' from source: unknown 30575 1726867612.71175: variable 'item' from source: unknown 30575 1726867612.71196: variable 'item' from source: unknown 30575 1726867612.71239: variable 'item' from source: unknown 30575 1726867612.71303: dumping result to json 30575 1726867612.71307: done dumping result, returning 30575 
1726867612.71309: done running TaskExecutor() for managed_node3/TASK: Setup [0affcac9-a3a5-e081-a588-000000001008] 30575 1726867612.71311: sending task result for task 0affcac9-a3a5-e081-a588-000000001008 30575 1726867612.71344: done sending task result for task 0affcac9-a3a5-e081-a588-000000001008 30575 1726867612.71347: WORKER PROCESS EXITING 30575 1726867612.71374: no more pending results, returning what we have 30575 1726867612.71381: in VariableManager get_vars() 30575 1726867612.71419: Calling all_inventory to load vars for managed_node3 30575 1726867612.71421: Calling groups_inventory to load vars for managed_node3 30575 1726867612.71424: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867612.71437: Calling all_plugins_play to load vars for managed_node3 30575 1726867612.71440: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867612.71442: Calling groups_plugins_play to load vars for managed_node3 30575 1726867612.72223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867612.73074: done with get_vars() 30575 1726867612.73089: variable 'ansible_search_path' from source: unknown 30575 1726867612.73090: variable 'ansible_search_path' from source: unknown 30575 1726867612.73118: variable 'ansible_search_path' from source: unknown 30575 1726867612.73119: variable 'ansible_search_path' from source: unknown 30575 1726867612.73136: we have included files to process 30575 1726867612.73137: generating all_blocks data 30575 1726867612.73138: done generating all_blocks data 30575 1726867612.73141: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30575 1726867612.73141: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30575 1726867612.73143: Loading data from 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30575 1726867612.73290: done processing included file 30575 1726867612.73291: iterating over new_blocks loaded from include file 30575 1726867612.73292: in VariableManager get_vars() 30575 1726867612.73301: done with get_vars() 30575 1726867612.73302: filtering new block on tags 30575 1726867612.73326: done filtering new block on tags 30575 1726867612.73328: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node3 => (item=tasks/create_bridge_profile.yml) 30575 1726867612.73331: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30575 1726867612.73332: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30575 1726867612.73334: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30575 1726867612.73389: done processing included file 30575 1726867612.73391: iterating over new_blocks loaded from include file 30575 1726867612.73391: in VariableManager get_vars() 30575 1726867612.73401: done with get_vars() 30575 1726867612.73402: filtering new block on tags 30575 1726867612.73414: done filtering new block on tags 30575 1726867612.73415: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node3 => (item=tasks/activate_profile.yml) 30575 1726867612.73418: extending task lists for all hosts with included blocks 30575 1726867612.73748: done extending task lists 30575 1726867612.73750: done processing 
included files 30575 1726867612.73751: results queue empty 30575 1726867612.73751: checking for any_errors_fatal 30575 1726867612.73755: done checking for any_errors_fatal 30575 1726867612.73756: checking for max_fail_percentage 30575 1726867612.73756: done checking for max_fail_percentage 30575 1726867612.73757: checking to see if all hosts have failed and the running result is not ok 30575 1726867612.73757: done checking to see if all hosts have failed 30575 1726867612.73758: getting the remaining hosts for this loop 30575 1726867612.73759: done getting the remaining hosts for this loop 30575 1726867612.73760: getting the next task for host managed_node3 30575 1726867612.73763: done getting next task for host managed_node3 30575 1726867612.73764: ^ task is: TASK: Include network role 30575 1726867612.73766: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867612.73768: getting variables 30575 1726867612.73768: in VariableManager get_vars() 30575 1726867612.73775: Calling all_inventory to load vars for managed_node3 30575 1726867612.73782: Calling groups_inventory to load vars for managed_node3 30575 1726867612.73784: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867612.73787: Calling all_plugins_play to load vars for managed_node3 30575 1726867612.73789: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867612.73790: Calling groups_plugins_play to load vars for managed_node3 30575 1726867612.74473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867612.75321: done with get_vars() 30575 1726867612.75335: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 17:26:52 -0400 (0:00:00.057) 0:00:48.131 ****** 30575 1726867612.75387: entering _queue_task() for managed_node3/include_role 30575 1726867612.75624: worker is 1 (out of 1 available) 30575 1726867612.75637: exiting _queue_task() for managed_node3/include_role 30575 1726867612.75650: done queuing things up, now waiting for results queue to drain 30575 1726867612.75651: waiting for pending results... 
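The Setup task above (run_test.yml:24) includes one file per item of `lsr_setup`, and the log shows two items resolved: tasks/create_bridge_profile.yml and tasks/activate_profile.yml. A sketch under the assumption that the loop iterates directly over `lsr_setup`:

```yaml
# Assumed shape of the include at run_test.yml:24; the log confirms the
# lsr_setup variable and the two included item paths, not this exact syntax.
- name: Setup
  ansible.builtin.include_tasks: "{{ item }}"
  loop: "{{ lsr_setup }}"
```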
30575 1726867612.75833: running TaskExecutor() for managed_node3/TASK: Include network role 30575 1726867612.75912: in run() - task 0affcac9-a3a5-e081-a588-00000000108f 30575 1726867612.75925: variable 'ansible_search_path' from source: unknown 30575 1726867612.75928: variable 'ansible_search_path' from source: unknown 30575 1726867612.75956: calling self._execute() 30575 1726867612.76030: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.76034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.76043: variable 'omit' from source: magic vars 30575 1726867612.76310: variable 'ansible_distribution_major_version' from source: facts 30575 1726867612.76318: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867612.76330: _execute() done 30575 1726867612.76333: dumping result to json 30575 1726867612.76336: done dumping result, returning 30575 1726867612.76342: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcac9-a3a5-e081-a588-00000000108f] 30575 1726867612.76348: sending task result for task 0affcac9-a3a5-e081-a588-00000000108f 30575 1726867612.76452: done sending task result for task 0affcac9-a3a5-e081-a588-00000000108f 30575 1726867612.76454: WORKER PROCESS EXITING 30575 1726867612.76481: no more pending results, returning what we have 30575 1726867612.76485: in VariableManager get_vars() 30575 1726867612.76524: Calling all_inventory to load vars for managed_node3 30575 1726867612.76526: Calling groups_inventory to load vars for managed_node3 30575 1726867612.76530: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867612.76540: Calling all_plugins_play to load vars for managed_node3 30575 1726867612.76543: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867612.76546: Calling groups_plugins_play to load vars for managed_node3 30575 1726867612.77312: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867612.78281: done with get_vars() 30575 1726867612.78296: variable 'ansible_search_path' from source: unknown 30575 1726867612.78297: variable 'ansible_search_path' from source: unknown 30575 1726867612.78408: variable 'omit' from source: magic vars 30575 1726867612.78435: variable 'omit' from source: magic vars 30575 1726867612.78444: variable 'omit' from source: magic vars 30575 1726867612.78447: we have included files to process 30575 1726867612.78447: generating all_blocks data 30575 1726867612.78448: done generating all_blocks data 30575 1726867612.78449: processing included file: fedora.linux_system_roles.network 30575 1726867612.78462: in VariableManager get_vars() 30575 1726867612.78471: done with get_vars() 30575 1726867612.78489: in VariableManager get_vars() 30575 1726867612.78500: done with get_vars() 30575 1726867612.78530: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30575 1726867612.78597: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30575 1726867612.78648: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30575 1726867612.78909: in VariableManager get_vars() 30575 1726867612.78924: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867612.80153: iterating over new_blocks loaded from include file 30575 1726867612.80155: in VariableManager get_vars() 30575 1726867612.80165: done with get_vars() 30575 1726867612.80166: filtering new block on tags 30575 1726867612.80325: done filtering new block on tags 30575 1726867612.80328: in VariableManager get_vars() 30575 1726867612.80337: done with get_vars() 30575 1726867612.80338: filtering new block on tags 30575 1726867612.80348: done 
filtering new block on tags 30575 1726867612.80349: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30575 1726867612.80353: extending task lists for all hosts with included blocks 30575 1726867612.80446: done extending task lists 30575 1726867612.80447: done processing included files 30575 1726867612.80447: results queue empty 30575 1726867612.80448: checking for any_errors_fatal 30575 1726867612.80450: done checking for any_errors_fatal 30575 1726867612.80450: checking for max_fail_percentage 30575 1726867612.80451: done checking for max_fail_percentage 30575 1726867612.80452: checking to see if all hosts have failed and the running result is not ok 30575 1726867612.80452: done checking to see if all hosts have failed 30575 1726867612.80453: getting the remaining hosts for this loop 30575 1726867612.80453: done getting the remaining hosts for this loop 30575 1726867612.80455: getting the next task for host managed_node3 30575 1726867612.80458: done getting next task for host managed_node3 30575 1726867612.80460: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867612.80462: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867612.80471: getting variables 30575 1726867612.80472: in VariableManager get_vars() 30575 1726867612.80481: Calling all_inventory to load vars for managed_node3 30575 1726867612.80483: Calling groups_inventory to load vars for managed_node3 30575 1726867612.80484: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867612.80488: Calling all_plugins_play to load vars for managed_node3 30575 1726867612.80489: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867612.80491: Calling groups_plugins_play to load vars for managed_node3 30575 1726867612.81113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867612.81962: done with get_vars() 30575 1726867612.81975: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:26:52 -0400 (0:00:00.066) 0:00:48.197 ****** 30575 1726867612.82027: entering _queue_task() for managed_node3/include_tasks 30575 1726867612.82278: worker is 1 (out of 1 available) 30575 1726867612.82290: exiting _queue_task() for managed_node3/include_tasks 30575 1726867612.82302: done queuing things up, now waiting for results queue to drain 30575 1726867612.82304: waiting for pending results... 
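The Include network role step (create_bridge_profile.yml:3) resolves to the fedora.linux_system_roles.network role, loading its defaults/main.yml, meta/main.yml, and tasks/main.yml as recorded above. A minimal sketch of such a task; the actual file may also pass role variables (e.g. network_connections) that are not visible in this log:

```yaml
# Assumed form of the task at create_bridge_profile.yml:3; only the role
# name fedora.linux_system_roles.network is confirmed by the log.
- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
```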
30575 1726867612.82486: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867612.82564: in run() - task 0affcac9-a3a5-e081-a588-0000000010f5 30575 1726867612.82575: variable 'ansible_search_path' from source: unknown 30575 1726867612.82580: variable 'ansible_search_path' from source: unknown 30575 1726867612.82610: calling self._execute() 30575 1726867612.82680: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.82684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.82695: variable 'omit' from source: magic vars 30575 1726867612.82960: variable 'ansible_distribution_major_version' from source: facts 30575 1726867612.82969: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867612.82981: _execute() done 30575 1726867612.82984: dumping result to json 30575 1726867612.82989: done dumping result, returning 30575 1726867612.82995: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-e081-a588-0000000010f5] 30575 1726867612.83000: sending task result for task 0affcac9-a3a5-e081-a588-0000000010f5 30575 1726867612.83089: done sending task result for task 0affcac9-a3a5-e081-a588-0000000010f5 30575 1726867612.83092: WORKER PROCESS EXITING 30575 1726867612.83134: no more pending results, returning what we have 30575 1726867612.83139: in VariableManager get_vars() 30575 1726867612.83183: Calling all_inventory to load vars for managed_node3 30575 1726867612.83186: Calling groups_inventory to load vars for managed_node3 30575 1726867612.83188: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867612.83199: Calling all_plugins_play to load vars for managed_node3 30575 1726867612.83201: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867612.83204: Calling 
groups_plugins_play to load vars for managed_node3 30575 1726867612.84057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867612.84923: done with get_vars() 30575 1726867612.84939: variable 'ansible_search_path' from source: unknown 30575 1726867612.84940: variable 'ansible_search_path' from source: unknown 30575 1726867612.84964: we have included files to process 30575 1726867612.84964: generating all_blocks data 30575 1726867612.84965: done generating all_blocks data 30575 1726867612.84967: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867612.84968: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867612.84969: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867612.85434: done processing included file 30575 1726867612.85436: iterating over new_blocks loaded from include file 30575 1726867612.85438: in VariableManager get_vars() 30575 1726867612.85460: done with get_vars() 30575 1726867612.85462: filtering new block on tags 30575 1726867612.85492: done filtering new block on tags 30575 1726867612.85495: in VariableManager get_vars() 30575 1726867612.85516: done with get_vars() 30575 1726867612.85518: filtering new block on tags 30575 1726867612.85561: done filtering new block on tags 30575 1726867612.85564: in VariableManager get_vars() 30575 1726867612.85587: done with get_vars() 30575 1726867612.85589: filtering new block on tags 30575 1726867612.85631: done filtering new block on tags 30575 1726867612.85633: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30575 1726867612.85638: extending task lists for 
all hosts with included blocks 30575 1726867612.87285: done extending task lists 30575 1726867612.87287: done processing included files 30575 1726867612.87288: results queue empty 30575 1726867612.87289: checking for any_errors_fatal 30575 1726867612.87292: done checking for any_errors_fatal 30575 1726867612.87293: checking for max_fail_percentage 30575 1726867612.87294: done checking for max_fail_percentage 30575 1726867612.87295: checking to see if all hosts have failed and the running result is not ok 30575 1726867612.87296: done checking to see if all hosts have failed 30575 1726867612.87296: getting the remaining hosts for this loop 30575 1726867612.87298: done getting the remaining hosts for this loop 30575 1726867612.87301: getting the next task for host managed_node3 30575 1726867612.87306: done getting next task for host managed_node3 30575 1726867612.87309: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867612.87313: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867612.87325: getting variables 30575 1726867612.87326: in VariableManager get_vars() 30575 1726867612.87342: Calling all_inventory to load vars for managed_node3 30575 1726867612.87345: Calling groups_inventory to load vars for managed_node3 30575 1726867612.87347: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867612.87353: Calling all_plugins_play to load vars for managed_node3 30575 1726867612.87355: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867612.87358: Calling groups_plugins_play to load vars for managed_node3 30575 1726867612.88584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867612.90037: done with get_vars() 30575 1726867612.90059: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:26:52 -0400 (0:00:00.081) 0:00:48.279 ****** 30575 1726867612.90147: entering _queue_task() for managed_node3/setup 30575 1726867612.90508: worker is 1 (out of 1 available) 30575 1726867612.90520: exiting _queue_task() for managed_node3/setup 30575 1726867612.90534: done queuing things up, now waiting for results queue to drain 30575 1726867612.90536: waiting for pending results... 
30575 1726867612.90900: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867612.91001: in run() - task 0affcac9-a3a5-e081-a588-000000001152 30575 1726867612.91084: variable 'ansible_search_path' from source: unknown 30575 1726867612.91087: variable 'ansible_search_path' from source: unknown 30575 1726867612.91091: calling self._execute() 30575 1726867612.91158: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867612.91172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867612.91192: variable 'omit' from source: magic vars 30575 1726867612.92083: variable 'ansible_distribution_major_version' from source: facts 30575 1726867612.92087: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867612.92505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867612.95299: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867612.95372: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867612.95421: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867612.95459: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867612.95492: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867612.95581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867612.95616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867612.95652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867612.95700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867612.95721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867612.95783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867612.95810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867612.95842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867612.95983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867612.95987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867612.96078: variable '__network_required_facts' from source: role 
'' defaults 30575 1726867612.96092: variable 'ansible_facts' from source: unknown 30575 1726867612.96845: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30575 1726867612.96854: when evaluation is False, skipping this task 30575 1726867612.96870: _execute() done 30575 1726867612.96882: dumping result to json 30575 1726867612.96891: done dumping result, returning 30575 1726867612.96905: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-e081-a588-000000001152] 30575 1726867612.96917: sending task result for task 0affcac9-a3a5-e081-a588-000000001152 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867612.97176: no more pending results, returning what we have 30575 1726867612.97184: results queue empty 30575 1726867612.97185: checking for any_errors_fatal 30575 1726867612.97186: done checking for any_errors_fatal 30575 1726867612.97187: checking for max_fail_percentage 30575 1726867612.97189: done checking for max_fail_percentage 30575 1726867612.97190: checking to see if all hosts have failed and the running result is not ok 30575 1726867612.97191: done checking to see if all hosts have failed 30575 1726867612.97192: getting the remaining hosts for this loop 30575 1726867612.97194: done getting the remaining hosts for this loop 30575 1726867612.97199: getting the next task for host managed_node3 30575 1726867612.97212: done getting next task for host managed_node3 30575 1726867612.97217: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867612.97223: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867612.97248: getting variables 30575 1726867612.97250: in VariableManager get_vars() 30575 1726867612.97403: Calling all_inventory to load vars for managed_node3 30575 1726867612.97405: Calling groups_inventory to load vars for managed_node3 30575 1726867612.97408: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867612.97420: Calling all_plugins_play to load vars for managed_node3 30575 1726867612.97423: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867612.97427: Calling groups_plugins_play to load vars for managed_node3 30575 1726867612.98090: done sending task result for task 0affcac9-a3a5-e081-a588-000000001152 30575 1726867612.98100: WORKER PROCESS EXITING 30575 1726867612.98915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867613.00456: done with get_vars() 30575 1726867613.00482: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:26:53 -0400 (0:00:00.104) 0:00:48.383 ****** 30575 1726867613.00585: entering _queue_task() for managed_node3/stat 30575 1726867613.01011: worker is 1 (out of 1 available) 30575 1726867613.01022: exiting _queue_task() for managed_node3/stat 30575 1726867613.01034: done queuing things up, now waiting for results queue to drain 30575 1726867613.01035: waiting for pending results... 
30575 1726867613.01239: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867613.01405: in run() - task 0affcac9-a3a5-e081-a588-000000001154 30575 1726867613.01429: variable 'ansible_search_path' from source: unknown 30575 1726867613.01437: variable 'ansible_search_path' from source: unknown 30575 1726867613.01483: calling self._execute() 30575 1726867613.01573: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867613.01591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867613.01606: variable 'omit' from source: magic vars 30575 1726867613.01976: variable 'ansible_distribution_major_version' from source: facts 30575 1726867613.01998: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867613.02161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867613.02460: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867613.02520: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867613.02558: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867613.02604: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867613.02698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867613.02735: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867613.02767: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867613.02801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867613.02899: variable '__network_is_ostree' from source: set_fact 30575 1726867613.02913: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867613.02923: when evaluation is False, skipping this task 30575 1726867613.02933: _execute() done 30575 1726867613.02941: dumping result to json 30575 1726867613.03083: done dumping result, returning 30575 1726867613.03087: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-e081-a588-000000001154] 30575 1726867613.03089: sending task result for task 0affcac9-a3a5-e081-a588-000000001154 30575 1726867613.03158: done sending task result for task 0affcac9-a3a5-e081-a588-000000001154 30575 1726867613.03162: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867613.03218: no more pending results, returning what we have 30575 1726867613.03223: results queue empty 30575 1726867613.03224: checking for any_errors_fatal 30575 1726867613.03232: done checking for any_errors_fatal 30575 1726867613.03233: checking for max_fail_percentage 30575 1726867613.03234: done checking for max_fail_percentage 30575 1726867613.03235: checking to see if all hosts have failed and the running result is not ok 30575 1726867613.03237: done checking to see if all hosts have failed 30575 1726867613.03238: getting the remaining hosts for this loop 30575 1726867613.03239: done getting the remaining hosts for this loop 30575 
1726867613.03243: getting the next task for host managed_node3 30575 1726867613.03253: done getting next task for host managed_node3 30575 1726867613.03257: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867613.03264: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867613.03289: getting variables 30575 1726867613.03291: in VariableManager get_vars() 30575 1726867613.03333: Calling all_inventory to load vars for managed_node3 30575 1726867613.03336: Calling groups_inventory to load vars for managed_node3 30575 1726867613.03339: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867613.03350: Calling all_plugins_play to load vars for managed_node3 30575 1726867613.03353: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867613.03356: Calling groups_plugins_play to load vars for managed_node3 30575 1726867613.05073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867613.06602: done with get_vars() 30575 1726867613.06627: done getting variables 30575 1726867613.06688: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:26:53 -0400 (0:00:00.061) 0:00:48.444 ****** 30575 1726867613.06727: entering _queue_task() for managed_node3/set_fact 30575 1726867613.07084: worker is 1 (out of 1 available) 30575 1726867613.07097: exiting _queue_task() for managed_node3/set_fact 30575 1726867613.07110: done queuing things up, now waiting for results queue to drain 30575 1726867613.07112: waiting for pending results... 
30575 1726867613.07409: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867613.07616: in run() - task 0affcac9-a3a5-e081-a588-000000001155 30575 1726867613.07619: variable 'ansible_search_path' from source: unknown 30575 1726867613.07622: variable 'ansible_search_path' from source: unknown 30575 1726867613.07625: calling self._execute() 30575 1726867613.07709: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867613.07726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867613.07742: variable 'omit' from source: magic vars 30575 1726867613.08123: variable 'ansible_distribution_major_version' from source: facts 30575 1726867613.08141: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867613.08312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867613.08598: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867613.08882: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867613.08886: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867613.08888: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867613.08891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867613.08893: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867613.08895: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867613.08897: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867613.08982: variable '__network_is_ostree' from source: set_fact 30575 1726867613.08995: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867613.09003: when evaluation is False, skipping this task 30575 1726867613.09015: _execute() done 30575 1726867613.09023: dumping result to json 30575 1726867613.09031: done dumping result, returning 30575 1726867613.09044: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-e081-a588-000000001155] 30575 1726867613.09054: sending task result for task 0affcac9-a3a5-e081-a588-000000001155 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867613.09195: no more pending results, returning what we have 30575 1726867613.09199: results queue empty 30575 1726867613.09200: checking for any_errors_fatal 30575 1726867613.09207: done checking for any_errors_fatal 30575 1726867613.09209: checking for max_fail_percentage 30575 1726867613.09210: done checking for max_fail_percentage 30575 1726867613.09211: checking to see if all hosts have failed and the running result is not ok 30575 1726867613.09212: done checking to see if all hosts have failed 30575 1726867613.09213: getting the remaining hosts for this loop 30575 1726867613.09215: done getting the remaining hosts for this loop 30575 1726867613.09219: getting the next task for host managed_node3 30575 1726867613.09232: done getting next task for host managed_node3 30575 
1726867613.09236: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867613.09243: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867613.09265: getting variables 30575 1726867613.09267: in VariableManager get_vars() 30575 1726867613.09309: Calling all_inventory to load vars for managed_node3 30575 1726867613.09312: Calling groups_inventory to load vars for managed_node3 30575 1726867613.09314: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867613.09325: Calling all_plugins_play to load vars for managed_node3 30575 1726867613.09328: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867613.09332: Calling groups_plugins_play to load vars for managed_node3 30575 1726867613.10016: done sending task result for task 0affcac9-a3a5-e081-a588-000000001155 30575 1726867613.10020: WORKER PROCESS EXITING 30575 1726867613.10931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867613.12448: done with get_vars() 30575 1726867613.12470: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:26:53 -0400 (0:00:00.058) 0:00:48.503 ****** 30575 1726867613.12568: entering _queue_task() for managed_node3/service_facts 30575 1726867613.12889: worker is 1 (out of 1 available) 30575 1726867613.12901: exiting _queue_task() for managed_node3/service_facts 30575 1726867613.12915: done queuing things up, now waiting for results queue to drain 30575 1726867613.12916: waiting for pending results... 
30575 1726867613.13153: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867613.13312: in run() - task 0affcac9-a3a5-e081-a588-000000001157 30575 1726867613.13335: variable 'ansible_search_path' from source: unknown 30575 1726867613.13343: variable 'ansible_search_path' from source: unknown 30575 1726867613.13383: calling self._execute() 30575 1726867613.13482: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867613.13496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867613.13521: variable 'omit' from source: magic vars 30575 1726867613.13981: variable 'ansible_distribution_major_version' from source: facts 30575 1726867613.14000: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867613.14011: variable 'omit' from source: magic vars 30575 1726867613.14104: variable 'omit' from source: magic vars 30575 1726867613.14142: variable 'omit' from source: magic vars 30575 1726867613.14196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867613.14237: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867613.14263: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867613.14295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867613.14313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867613.14350: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867613.14361: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867613.14369: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867613.14491: Set connection var ansible_pipelining to False 30575 1726867613.14494: Set connection var ansible_shell_type to sh 30575 1726867613.14497: Set connection var ansible_shell_executable to /bin/sh 30575 1726867613.14506: Set connection var ansible_timeout to 10 30575 1726867613.14582: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867613.14586: Set connection var ansible_connection to ssh 30575 1726867613.14588: variable 'ansible_shell_executable' from source: unknown 30575 1726867613.14590: variable 'ansible_connection' from source: unknown 30575 1726867613.14594: variable 'ansible_module_compression' from source: unknown 30575 1726867613.14596: variable 'ansible_shell_type' from source: unknown 30575 1726867613.14599: variable 'ansible_shell_executable' from source: unknown 30575 1726867613.14601: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867613.14603: variable 'ansible_pipelining' from source: unknown 30575 1726867613.14605: variable 'ansible_timeout' from source: unknown 30575 1726867613.14607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867613.14800: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867613.14817: variable 'omit' from source: magic vars 30575 1726867613.14831: starting attempt loop 30575 1726867613.14839: running the handler 30575 1726867613.14858: _low_level_execute_command(): starting 30575 1726867613.14983: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867613.15598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867613.15616: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30575 1726867613.15653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867613.15672: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867613.15763: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867613.15782: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867613.15810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867613.15895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867613.17582: stdout chunk (state=3): >>>/root <<< 30575 1726867613.17682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867613.17718: stderr chunk (state=3): >>><<< 30575 1726867613.17721: stdout chunk (state=3): >>><<< 30575 1726867613.17733: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867613.17782: _low_level_execute_command(): starting 30575 1726867613.17786: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867613.1773777-32972-189760308048788 `" && echo ansible-tmp-1726867613.1773777-32972-189760308048788="` echo /root/.ansible/tmp/ansible-tmp-1726867613.1773777-32972-189760308048788 `" ) && sleep 0' 30575 1726867613.18194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867613.18207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867613.18211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867613.18222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867613.18225: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867613.18227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867613.18290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867613.18307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867613.18340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867613.20238: stdout chunk (state=3): >>>ansible-tmp-1726867613.1773777-32972-189760308048788=/root/.ansible/tmp/ansible-tmp-1726867613.1773777-32972-189760308048788 <<< 30575 1726867613.20349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867613.20373: stderr chunk (state=3): >>><<< 30575 1726867613.20378: stdout chunk (state=3): >>><<< 30575 1726867613.20393: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867613.1773777-32972-189760308048788=/root/.ansible/tmp/ansible-tmp-1726867613.1773777-32972-189760308048788 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867613.20432: variable 'ansible_module_compression' from source: unknown 30575 1726867613.20467: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30575 1726867613.20500: variable 'ansible_facts' from source: unknown 30575 1726867613.20557: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867613.1773777-32972-189760308048788/AnsiballZ_service_facts.py 30575 1726867613.20648: Sending initial data 30575 1726867613.20651: Sent initial data (162 bytes) 30575 1726867613.21264: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867613.21318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867613.21364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867613.22885: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867613.22889: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867613.22932: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867613.22972: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp9q5b_0pw /root/.ansible/tmp/ansible-tmp-1726867613.1773777-32972-189760308048788/AnsiballZ_service_facts.py <<< 30575 1726867613.22980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867613.1773777-32972-189760308048788/AnsiballZ_service_facts.py" <<< 30575 1726867613.23020: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp9q5b_0pw" to remote "/root/.ansible/tmp/ansible-tmp-1726867613.1773777-32972-189760308048788/AnsiballZ_service_facts.py" <<< 30575 1726867613.23023: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867613.1773777-32972-189760308048788/AnsiballZ_service_facts.py" <<< 30575 1726867613.23648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867613.23674: stderr chunk (state=3): >>><<< 30575 1726867613.23859: stdout chunk (state=3): >>><<< 30575 1726867613.23863: done transferring module to remote 30575 1726867613.23866: _low_level_execute_command(): starting 30575 1726867613.23868: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867613.1773777-32972-189760308048788/ /root/.ansible/tmp/ansible-tmp-1726867613.1773777-32972-189760308048788/AnsiballZ_service_facts.py && sleep 0' 30575 1726867613.24409: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867613.24484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867613.24534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867613.26243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867613.26268: stderr chunk (state=3): >>><<< 30575 1726867613.26271: stdout chunk (state=3): >>><<< 30575 1726867613.26286: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867613.26289: _low_level_execute_command(): starting 30575 1726867613.26294: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867613.1773777-32972-189760308048788/AnsiballZ_service_facts.py && sleep 0' 30575 1726867613.26683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867613.26739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867613.26742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867613.26746: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867613.26816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867613.26879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867613.26899: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30575 1726867614.79127: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": 
{"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 30575 1726867614.79145: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", 
"status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": 
"systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 30575 1726867614.79189: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": 
"nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": 
"systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": 
"systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": 
{"module_args": {}}} <<< 30575 1726867614.80652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867614.80798: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. <<< 30575 1726867614.80801: stdout chunk (state=3): >>><<< 30575 1726867614.80804: stderr chunk (state=3): >>><<< 30575 1726867614.80810: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": 
"dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, 
"sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": 
{"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867614.82148: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867613.1773777-32972-189760308048788/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867614.82162: _low_level_execute_command(): starting 30575 1726867614.82186: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867613.1773777-32972-189760308048788/ > /dev/null 2>&1 && sleep 0' 30575 1726867614.82657: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867614.82664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867614.82684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867614.82733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867614.82737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867614.82741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867614.82791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867614.84609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867614.84644: stderr chunk (state=3): >>><<< 30575 1726867614.84647: stdout chunk (state=3): >>><<< 30575 1726867614.84656: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867614.84663: handler run 
complete 30575 1726867614.84783: variable 'ansible_facts' from source: unknown 30575 1726867614.84876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867614.85147: variable 'ansible_facts' from source: unknown 30575 1726867614.85229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867614.85342: attempt loop complete, returning result 30575 1726867614.85345: _execute() done 30575 1726867614.85348: dumping result to json 30575 1726867614.85387: done dumping result, returning 30575 1726867614.85396: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-e081-a588-000000001157] 30575 1726867614.85399: sending task result for task 0affcac9-a3a5-e081-a588-000000001157 30575 1726867614.86113: done sending task result for task 0affcac9-a3a5-e081-a588-000000001157 30575 1726867614.86118: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867614.86165: no more pending results, returning what we have 30575 1726867614.86167: results queue empty 30575 1726867614.86168: checking for any_errors_fatal 30575 1726867614.86170: done checking for any_errors_fatal 30575 1726867614.86170: checking for max_fail_percentage 30575 1726867614.86171: done checking for max_fail_percentage 30575 1726867614.86172: checking to see if all hosts have failed and the running result is not ok 30575 1726867614.86173: done checking to see if all hosts have failed 30575 1726867614.86173: getting the remaining hosts for this loop 30575 1726867614.86174: done getting the remaining hosts for this loop 30575 1726867614.86176: getting the next task for host managed_node3 30575 1726867614.86184: done getting next task for host managed_node3 30575 
1726867614.86186: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867614.86191: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867614.86198: getting variables 30575 1726867614.86199: in VariableManager get_vars() 30575 1726867614.86222: Calling all_inventory to load vars for managed_node3 30575 1726867614.86224: Calling groups_inventory to load vars for managed_node3 30575 1726867614.86225: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867614.86231: Calling all_plugins_play to load vars for managed_node3 30575 1726867614.86233: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867614.86235: Calling groups_plugins_play to load vars for managed_node3 30575 1726867614.87252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867614.88459: done with get_vars() 30575 1726867614.88475: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:26:54 -0400 (0:00:01.759) 0:00:50.263 ****** 30575 1726867614.88547: entering _queue_task() for managed_node3/package_facts 30575 1726867614.88786: worker is 1 (out of 1 available) 30575 1726867614.88799: exiting _queue_task() for managed_node3/package_facts 30575 1726867614.88813: done queuing things up, now waiting for results queue to drain 30575 1726867614.88814: waiting for pending results... 
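The task queued above runs the `package_facts` module, which returns `ansible_facts.packages`: a dict mapping each package name to a list of installed instances (a list because e.g. several kernels can coexist). A sketch of reading such a result — the sample data is illustrative, not taken from this run:

```python
# Illustrative package_facts-style result (not from the log above).
packages = {
    "openssh": [{"name": "openssh", "version": "9.8p1",
                 "release": "1", "arch": "x86_64", "source": "rpm"}],
    "kernel": [{"name": "kernel", "version": "6.10",
                "release": "1", "arch": "x86_64", "source": "rpm"},
               {"name": "kernel", "version": "6.11",
                "release": "1", "arch": "x86_64", "source": "rpm"}],
}

def versions_of(pkgs: dict, name: str) -> list[str]:
    """All installed versions of a package; empty list if absent."""
    return [p["version"] for p in pkgs.get(name, [])]
```

A role can then gate behavior on whether (and in what version) a package is present without shelling out to the package manager again.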
30575 1726867614.89002: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867614.89097: in run() - task 0affcac9-a3a5-e081-a588-000000001158 30575 1726867614.89108: variable 'ansible_search_path' from source: unknown 30575 1726867614.89112: variable 'ansible_search_path' from source: unknown 30575 1726867614.89140: calling self._execute() 30575 1726867614.89219: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867614.89224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867614.89230: variable 'omit' from source: magic vars 30575 1726867614.89573: variable 'ansible_distribution_major_version' from source: facts 30575 1726867614.89690: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867614.89693: variable 'omit' from source: magic vars 30575 1726867614.89694: variable 'omit' from source: magic vars 30575 1726867614.89715: variable 'omit' from source: magic vars 30575 1726867614.89759: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867614.89803: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867614.89827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867614.89848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867614.89864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867614.89901: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867614.89915: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867614.89923: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867614.90019: Set connection var ansible_pipelining to False 30575 1726867614.90029: Set connection var ansible_shell_type to sh 30575 1726867614.90044: Set connection var ansible_shell_executable to /bin/sh 30575 1726867614.90059: Set connection var ansible_timeout to 10 30575 1726867614.90069: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867614.90086: Set connection var ansible_connection to ssh 30575 1726867614.90131: variable 'ansible_shell_executable' from source: unknown 30575 1726867614.90134: variable 'ansible_connection' from source: unknown 30575 1726867614.90138: variable 'ansible_module_compression' from source: unknown 30575 1726867614.90140: variable 'ansible_shell_type' from source: unknown 30575 1726867614.90143: variable 'ansible_shell_executable' from source: unknown 30575 1726867614.90145: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867614.90147: variable 'ansible_pipelining' from source: unknown 30575 1726867614.90148: variable 'ansible_timeout' from source: unknown 30575 1726867614.90150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867614.90383: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867614.90388: variable 'omit' from source: magic vars 30575 1726867614.90390: starting attempt loop 30575 1726867614.90396: running the handler 30575 1726867614.90455: _low_level_execute_command(): starting 30575 1726867614.90458: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867614.91149: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30575 1726867614.91165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867614.91186: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867614.91228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867614.91248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867614.91293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867614.92944: stdout chunk (state=3): >>>/root <<< 30575 1726867614.93094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867614.93097: stdout chunk (state=3): >>><<< 30575 1726867614.93099: stderr chunk (state=3): >>><<< 30575 1726867614.93201: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867614.93205: _low_level_execute_command(): starting 30575 1726867614.93215: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867614.931228-33048-267033641009124 `" && echo ansible-tmp-1726867614.931228-33048-267033641009124="` echo /root/.ansible/tmp/ansible-tmp-1726867614.931228-33048-267033641009124 `" ) && sleep 0' 30575 1726867614.93766: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867614.93775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867614.93815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867614.95674: stdout chunk (state=3): >>>ansible-tmp-1726867614.931228-33048-267033641009124=/root/.ansible/tmp/ansible-tmp-1726867614.931228-33048-267033641009124 <<< 30575 1726867614.95813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867614.95849: stderr chunk (state=3): >>><<< 30575 1726867614.95853: stdout chunk (state=3): >>><<< 30575 1726867614.95856: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867614.931228-33048-267033641009124=/root/.ansible/tmp/ansible-tmp-1726867614.931228-33048-267033641009124 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
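The remote tmpdir names echoed back above follow the pattern `ansible-tmp-<timestamp>-<number>-<number>`; reading the fields as epoch seconds, worker pid, and a random suffix is an assumption based on the pattern, not something the log states. A sketch parsing one such name:

```python
import re

# Assumed field meanings: epoch seconds, worker pid, random suffix.
TMP_RE = re.compile(r"ansible-tmp-(?P<epoch>\d+\.\d+)-(?P<pid>\d+)-(?P<rand>\d+)$")

def parse_tmpdir(path: str) -> dict:
    """Split an ansible-tmp directory name into its dash-separated fields."""
    m = TMP_RE.search(path)
    if not m:
        raise ValueError(f"not an ansible tmpdir: {path}")
    return {"epoch": float(m["epoch"]), "pid": int(m["pid"]), "rand": m["rand"]}

info = parse_tmpdir(
    "/root/.ansible/tmp/ansible-tmp-1726867614.931228-33048-267033641009124")
```

Under that reading, the middle field here (33048) matches the pid embedded in the matching debug lines.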
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867614.95955: variable 'ansible_module_compression' from source: unknown 30575 1726867614.95958: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30575 1726867614.96026: variable 'ansible_facts' from source: unknown 30575 1726867614.96263: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867614.931228-33048-267033641009124/AnsiballZ_package_facts.py 30575 1726867614.96375: Sending initial data 30575 1726867614.96381: Sent initial data (161 bytes) 30575 1726867614.96994: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867614.97023: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30575 1726867614.97109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867614.98650: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867614.98697: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867614.98739: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp2rt3zpt8 /root/.ansible/tmp/ansible-tmp-1726867614.931228-33048-267033641009124/AnsiballZ_package_facts.py <<< 30575 1726867614.98742: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867614.931228-33048-267033641009124/AnsiballZ_package_facts.py" <<< 30575 1726867614.98962: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp2rt3zpt8" to remote "/root/.ansible/tmp/ansible-tmp-1726867614.931228-33048-267033641009124/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867614.931228-33048-267033641009124/AnsiballZ_package_facts.py" <<< 30575 1726867615.01532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867615.01574: stdout chunk (state=3): >>><<< 30575 1726867615.01580: stderr chunk (state=3): >>><<< 30575 1726867615.01603: done transferring module to remote 30575 1726867615.01624: _low_level_execute_command(): starting 30575 1726867615.01663: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867614.931228-33048-267033641009124/ /root/.ansible/tmp/ansible-tmp-1726867614.931228-33048-267033641009124/AnsiballZ_package_facts.py && sleep 0' 30575 1726867615.02330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867615.02348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867615.02371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867615.02375: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867615.02380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867615.02439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867615.02442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867615.02486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867615.04248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867615.04288: stderr chunk (state=3): >>><<< 30575 1726867615.04291: stdout chunk (state=3): >>><<< 30575 1726867615.04383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867615.04395: _low_level_execute_command(): starting 30575 1726867615.04398: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867614.931228-33048-267033641009124/AnsiballZ_package_facts.py && sleep 0' 30575 1726867615.04927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867615.04949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867615.04978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867615.05072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867615.05100: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867615.05141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867615.49357: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30575 1726867615.49385: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", 
"version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": 
"4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": 
"libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 30575 1726867615.49687: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", 
"version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": 
"0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source":
"rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", 
"version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": 
"libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": 
"perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", 
"version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": 
"git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", 
"version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": 
"python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release":
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30575 1726867615.51335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867615.51387: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867615.51396: stdout chunk (state=3): >>><<< 30575 1726867615.51405: stderr chunk (state=3): >>><<< 30575 1726867615.51446: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
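The module result above (note the `module_args` of `{"manager": ["auto"], "strategy": "first"}` and the `'_ansible_no_log': True` flag in the record that follows) corresponds to a task along these lines. This is a hypothetical reconstruction from the log, not the role's verbatim task file:

```yaml
# Sketch of the traced call, reconstructed from the log's module_args.
# The actual task in fedora.linux_system_roles.network may differ.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto      # log shows manager: ["auto"]
    strategy: first    # stop at the first manager that works
  no_log: true         # why the play output reads "censored: ... 'no_log: true'"
```

The `no_log: true` setting explains the censored `ok:` result a few records below: the full package inventory still lands in `ansible_facts.packages` on the controller, but is hidden from the play output.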
30575 1726867615.55365: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867614.931228-33048-267033641009124/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867615.55461: _low_level_execute_command(): starting 30575 1726867615.55471: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867614.931228-33048-267033641009124/ > /dev/null 2>&1 && sleep 0' 30575 1726867615.56069: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867615.56085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867615.56136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867615.56151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867615.56245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867615.56268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867615.56285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867615.56368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867615.58256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867615.58265: stdout chunk (state=3): >>><<< 30575 1726867615.58275: stderr chunk (state=3): >>><<< 30575 1726867615.58300: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 30575 1726867615.58314: handler run complete 30575 1726867615.59237: variable 'ansible_facts' from source: unknown 30575 1726867615.59674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867615.62483: variable 'ansible_facts' from source: unknown 30575 1726867615.62798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867615.63453: attempt loop complete, returning result 30575 1726867615.63469: _execute() done 30575 1726867615.63476: dumping result to json 30575 1726867615.63708: done dumping result, returning 30575 1726867615.63724: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-e081-a588-000000001158] 30575 1726867615.63738: sending task result for task 0affcac9-a3a5-e081-a588-000000001158 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867615.67012: no more pending results, returning what we have 30575 1726867615.67018: results queue empty 30575 1726867615.67019: checking for any_errors_fatal 30575 1726867615.67024: done checking for any_errors_fatal 30575 1726867615.67025: checking for max_fail_percentage 30575 1726867615.67027: done checking for max_fail_percentage 30575 1726867615.67027: checking to see if all hosts have failed and the running result is not ok 30575 1726867615.67028: done checking to see if all hosts have failed 30575 1726867615.67029: getting the remaining hosts for this loop 30575 1726867615.67030: done getting the remaining hosts for this loop 30575 1726867615.67034: getting the next task for host managed_node3 30575 1726867615.67043: done getting next task for host managed_node3 30575 1726867615.67047: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30575 
1726867615.67112: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867615.67127: done sending task result for task 0affcac9-a3a5-e081-a588-000000001158 30575 1726867615.67130: WORKER PROCESS EXITING 30575 1726867615.67139: getting variables 30575 1726867615.67141: in VariableManager get_vars() 30575 1726867615.67170: Calling all_inventory to load vars for managed_node3 30575 1726867615.67172: Calling groups_inventory to load vars for managed_node3 30575 1726867615.67175: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867615.67186: Calling all_plugins_play to load vars for managed_node3 30575 1726867615.67189: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867615.67191: Calling groups_plugins_play to load vars for managed_node3 30575 1726867615.69014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867615.71608: done with get_vars() 30575 1726867615.71634: done getting variables 30575 1726867615.71705: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:26:55 -0400 (0:00:00.831) 0:00:51.095 ****** 30575 1726867615.71747: entering _queue_task() for managed_node3/debug 30575 1726867615.72137: worker is 1 (out of 1 available) 30575 1726867615.72149: exiting _queue_task() for managed_node3/debug 30575 1726867615.72163: done queuing things up, now waiting for results queue to drain 30575 1726867615.72164: waiting for pending results... 
30575 1726867615.72473: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867615.72623: in run() - task 0affcac9-a3a5-e081-a588-0000000010f6 30575 1726867615.72646: variable 'ansible_search_path' from source: unknown 30575 1726867615.72664: variable 'ansible_search_path' from source: unknown 30575 1726867615.72706: calling self._execute() 30575 1726867615.72838: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867615.72849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867615.72863: variable 'omit' from source: magic vars 30575 1726867615.73584: variable 'ansible_distribution_major_version' from source: facts 30575 1726867615.73601: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867615.73620: variable 'omit' from source: magic vars 30575 1726867615.73686: variable 'omit' from source: magic vars 30575 1726867615.73841: variable 'network_provider' from source: set_fact 30575 1726867615.73845: variable 'omit' from source: magic vars 30575 1726867615.73873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867615.73915: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867615.73951: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867615.73975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867615.73994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867615.74030: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867615.74058: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867615.74064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867615.74182: Set connection var ansible_pipelining to False 30575 1726867615.74186: Set connection var ansible_shell_type to sh 30575 1726867615.74483: Set connection var ansible_shell_executable to /bin/sh 30575 1726867615.74488: Set connection var ansible_timeout to 10 30575 1726867615.74491: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867615.74493: Set connection var ansible_connection to ssh 30575 1726867615.74495: variable 'ansible_shell_executable' from source: unknown 30575 1726867615.74497: variable 'ansible_connection' from source: unknown 30575 1726867615.74499: variable 'ansible_module_compression' from source: unknown 30575 1726867615.74501: variable 'ansible_shell_type' from source: unknown 30575 1726867615.74503: variable 'ansible_shell_executable' from source: unknown 30575 1726867615.74505: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867615.74507: variable 'ansible_pipelining' from source: unknown 30575 1726867615.74509: variable 'ansible_timeout' from source: unknown 30575 1726867615.74511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867615.74847: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867615.74851: variable 'omit' from source: magic vars 30575 1726867615.74853: starting attempt loop 30575 1726867615.74855: running the handler 30575 1726867615.75066: handler run complete 30575 1726867615.75071: attempt loop complete, returning result 30575 1726867615.75074: _execute() done 30575 1726867615.75076: dumping result to json 30575 1726867615.75081: done dumping result, returning 
30575 1726867615.75084: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-e081-a588-0000000010f6] 30575 1726867615.75086: sending task result for task 0affcac9-a3a5-e081-a588-0000000010f6 30575 1726867615.75281: done sending task result for task 0affcac9-a3a5-e081-a588-0000000010f6 ok: [managed_node3] => {} MSG: Using network provider: nm 30575 1726867615.75360: no more pending results, returning what we have 30575 1726867615.75364: results queue empty 30575 1726867615.75365: checking for any_errors_fatal 30575 1726867615.75380: done checking for any_errors_fatal 30575 1726867615.75381: checking for max_fail_percentage 30575 1726867615.75383: done checking for max_fail_percentage 30575 1726867615.75384: checking to see if all hosts have failed and the running result is not ok 30575 1726867615.75385: done checking to see if all hosts have failed 30575 1726867615.75390: getting the remaining hosts for this loop 30575 1726867615.75391: done getting the remaining hosts for this loop 30575 1726867615.75396: getting the next task for host managed_node3 30575 1726867615.75405: done getting next task for host managed_node3 30575 1726867615.75409: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867615.75415: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867615.75431: getting variables 30575 1726867615.75433: in VariableManager get_vars() 30575 1726867615.75473: Calling all_inventory to load vars for managed_node3 30575 1726867615.75476: Calling groups_inventory to load vars for managed_node3 30575 1726867615.75686: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867615.75700: Calling all_plugins_play to load vars for managed_node3 30575 1726867615.75704: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867615.75707: Calling groups_plugins_play to load vars for managed_node3 30575 1726867615.76302: WORKER PROCESS EXITING 30575 1726867615.77524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867615.79183: done with get_vars() 30575 1726867615.79205: done getting variables 30575 1726867615.79272: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:26:55 -0400 (0:00:00.075) 0:00:51.170 ****** 30575 1726867615.79318: entering _queue_task() for managed_node3/fail 30575 1726867615.79673: worker is 1 (out of 1 available) 30575 1726867615.79689: exiting _queue_task() for managed_node3/fail 30575 1726867615.79706: done queuing things up, now waiting for results queue to drain 30575 1726867615.79708: waiting for pending results... 30575 1726867615.79945: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867615.80095: in run() - task 0affcac9-a3a5-e081-a588-0000000010f7 30575 1726867615.80120: variable 'ansible_search_path' from source: unknown 30575 1726867615.80130: variable 'ansible_search_path' from source: unknown 30575 1726867615.80175: calling self._execute() 30575 1726867615.80287: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867615.80300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867615.80483: variable 'omit' from source: magic vars 30575 1726867615.80697: variable 'ansible_distribution_major_version' from source: facts 30575 1726867615.80719: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867615.80840: variable 'network_state' from source: role '' defaults 30575 1726867615.80855: Evaluated conditional (network_state != {}): False 30575 1726867615.80863: when evaluation is False, skipping this task 30575 1726867615.80870: _execute() done 30575 1726867615.80879: dumping result to json 30575 1726867615.80888: done dumping result, returning 30575 1726867615.80900: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [0affcac9-a3a5-e081-a588-0000000010f7] 30575 1726867615.80911: sending task result for task 0affcac9-a3a5-e081-a588-0000000010f7 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867615.81076: no more pending results, returning what we have 30575 1726867615.81083: results queue empty 30575 1726867615.81084: checking for any_errors_fatal 30575 1726867615.81093: done checking for any_errors_fatal 30575 1726867615.81094: checking for max_fail_percentage 30575 1726867615.81095: done checking for max_fail_percentage 30575 1726867615.81096: checking to see if all hosts have failed and the running result is not ok 30575 1726867615.81097: done checking to see if all hosts have failed 30575 1726867615.81098: getting the remaining hosts for this loop 30575 1726867615.81100: done getting the remaining hosts for this loop 30575 1726867615.81103: getting the next task for host managed_node3 30575 1726867615.81113: done getting next task for host managed_node3 30575 1726867615.81117: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867615.81123: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867615.81148: getting variables 30575 1726867615.81150: in VariableManager get_vars() 30575 1726867615.81190: Calling all_inventory to load vars for managed_node3 30575 1726867615.81193: Calling groups_inventory to load vars for managed_node3 30575 1726867615.81195: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867615.81207: Calling all_plugins_play to load vars for managed_node3 30575 1726867615.81210: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867615.81213: Calling groups_plugins_play to load vars for managed_node3 30575 1726867615.81890: done sending task result for task 0affcac9-a3a5-e081-a588-0000000010f7 30575 1726867615.81894: WORKER PROCESS EXITING 30575 1726867615.82730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867615.84323: done with get_vars() 30575 1726867615.84343: done getting variables 30575 1726867615.84398: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:26:55 -0400 (0:00:00.051) 0:00:51.221 ****** 30575 1726867615.84431: entering _queue_task() for managed_node3/fail 30575 1726867615.84719: worker is 1 (out of 1 available) 30575 1726867615.84730: exiting _queue_task() for managed_node3/fail 30575 1726867615.84743: done queuing things up, now waiting for results queue to drain 30575 1726867615.84744: waiting for pending results... 30575 1726867615.85017: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867615.85161: in run() - task 0affcac9-a3a5-e081-a588-0000000010f8 30575 1726867615.85180: variable 'ansible_search_path' from source: unknown 30575 1726867615.85188: variable 'ansible_search_path' from source: unknown 30575 1726867615.85232: calling self._execute() 30575 1726867615.85382: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867615.85385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867615.85387: variable 'omit' from source: magic vars 30575 1726867615.85692: variable 'ansible_distribution_major_version' from source: facts 30575 1726867615.85708: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867615.85831: variable 'network_state' from source: role '' defaults 30575 1726867615.85858: Evaluated conditional (network_state != {}): False 30575 1726867615.85866: when evaluation is False, skipping this task 30575 1726867615.85873: _execute() done 30575 1726867615.85883: dumping result to json 30575 1726867615.85892: done dumping result, returning 30575 1726867615.85904: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [0affcac9-a3a5-e081-a588-0000000010f8] 30575 1726867615.85982: sending task result for task 0affcac9-a3a5-e081-a588-0000000010f8 30575 1726867615.86051: done sending task result for task 0affcac9-a3a5-e081-a588-0000000010f8 30575 1726867615.86055: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867615.86106: no more pending results, returning what we have 30575 1726867615.86111: results queue empty 30575 1726867615.86112: checking for any_errors_fatal 30575 1726867615.86124: done checking for any_errors_fatal 30575 1726867615.86125: checking for max_fail_percentage 30575 1726867615.86127: done checking for max_fail_percentage 30575 1726867615.86128: checking to see if all hosts have failed and the running result is not ok 30575 1726867615.86129: done checking to see if all hosts have failed 30575 1726867615.86129: getting the remaining hosts for this loop 30575 1726867615.86131: done getting the remaining hosts for this loop 30575 1726867615.86134: getting the next task for host managed_node3 30575 1726867615.86144: done getting next task for host managed_node3 30575 1726867615.86147: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867615.86153: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867615.86179: getting variables 30575 1726867615.86181: in VariableManager get_vars() 30575 1726867615.86218: Calling all_inventory to load vars for managed_node3 30575 1726867615.86221: Calling groups_inventory to load vars for managed_node3 30575 1726867615.86224: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867615.86236: Calling all_plugins_play to load vars for managed_node3 30575 1726867615.86240: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867615.86243: Calling groups_plugins_play to load vars for managed_node3 30575 1726867615.87689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867615.89216: done with get_vars() 30575 1726867615.89237: done getting variables 30575 1726867615.89295: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 
September 2024 17:26:55 -0400 (0:00:00.048) 0:00:51.270 ****** 30575 1726867615.89328: entering _queue_task() for managed_node3/fail 30575 1726867615.89603: worker is 1 (out of 1 available) 30575 1726867615.89615: exiting _queue_task() for managed_node3/fail 30575 1726867615.89626: done queuing things up, now waiting for results queue to drain 30575 1726867615.89628: waiting for pending results... 30575 1726867615.90005: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867615.90038: in run() - task 0affcac9-a3a5-e081-a588-0000000010f9 30575 1726867615.90058: variable 'ansible_search_path' from source: unknown 30575 1726867615.90068: variable 'ansible_search_path' from source: unknown 30575 1726867615.90283: calling self._execute() 30575 1726867615.90287: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867615.90290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867615.90293: variable 'omit' from source: magic vars 30575 1726867615.90600: variable 'ansible_distribution_major_version' from source: facts 30575 1726867615.90617: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867615.90801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867615.93143: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867615.93147: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867615.93160: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867615.93201: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 
1726867615.93233: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867615.93322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867615.93361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867615.93395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867615.93438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867615.93459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867615.93557: variable 'ansible_distribution_major_version' from source: facts 30575 1726867615.93582: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30575 1726867615.93694: variable 'ansible_distribution' from source: facts 30575 1726867615.93704: variable '__network_rh_distros' from source: role '' defaults 30575 1726867615.93717: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30575 1726867615.93969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867615.94000: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867615.94034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867615.94118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867615.94121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867615.94151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867615.94183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867615.94212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867615.94257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867615.94274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 
1726867615.94334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867615.94350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867615.94382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867615.94442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867615.94447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867615.94783: variable 'network_connections' from source: include params 30575 1726867615.94879: variable 'interface' from source: play vars 30575 1726867615.94882: variable 'interface' from source: play vars 30575 1726867615.94887: variable 'network_state' from source: role '' defaults 30575 1726867615.94956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867615.95121: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867615.95161: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867615.95198: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867615.95234: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867615.95281: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867615.95482: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867615.95493: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867615.95496: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867615.95498: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30575 1726867615.95500: when evaluation is False, skipping this task 30575 1726867615.95502: _execute() done 30575 1726867615.95504: dumping result to json 30575 1726867615.95506: done dumping result, returning 30575 1726867615.95508: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-e081-a588-0000000010f9] 30575 1726867615.95510: sending task result for task 0affcac9-a3a5-e081-a588-0000000010f9 30575 1726867615.95573: done sending task result for task 0affcac9-a3a5-e081-a588-0000000010f9 30575 1726867615.95576: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30575 1726867615.95624: no more pending results, returning what we have 30575 1726867615.95627: results queue empty 30575 1726867615.95628: checking for any_errors_fatal 30575 1726867615.95636: done checking for any_errors_fatal 30575 1726867615.95637: checking for max_fail_percentage 30575 1726867615.95639: done checking for max_fail_percentage 30575 1726867615.95640: checking to see if all hosts have failed and the running result is not ok 30575 1726867615.95641: done checking to see if all hosts have failed 30575 1726867615.95641: getting the remaining hosts for this loop 30575 1726867615.95643: done getting the remaining hosts for this loop 30575 1726867615.95647: getting the next task for host managed_node3 30575 1726867615.95656: done getting next task for host managed_node3 30575 1726867615.95660: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867615.95666: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867615.95689: getting variables 30575 1726867615.95691: in VariableManager get_vars() 30575 1726867615.95728: Calling all_inventory to load vars for managed_node3 30575 1726867615.95730: Calling groups_inventory to load vars for managed_node3 30575 1726867615.95733: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867615.95744: Calling all_plugins_play to load vars for managed_node3 30575 1726867615.95747: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867615.95751: Calling groups_plugins_play to load vars for managed_node3 30575 1726867615.97434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867616.04695: done with get_vars() 30575 1726867616.04725: done getting variables 30575 1726867616.04949: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:26:56 -0400 (0:00:00.156) 0:00:51.427 ****** 30575 1726867616.04984: entering _queue_task() for managed_node3/dnf 30575 1726867616.05353: worker is 1 (out of 1 available) 30575 1726867616.05368: exiting _queue_task() for managed_node3/dnf 30575 1726867616.05384: done queuing things up, now waiting for results queue to drain 30575 1726867616.05387: waiting for pending results... 30575 1726867616.05874: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867616.06282: in run() - task 0affcac9-a3a5-e081-a588-0000000010fa 30575 1726867616.06287: variable 'ansible_search_path' from source: unknown 30575 1726867616.06290: variable 'ansible_search_path' from source: unknown 30575 1726867616.06294: calling self._execute() 30575 1726867616.06359: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867616.06375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867616.06394: variable 'omit' from source: magic vars 30575 1726867616.06840: variable 'ansible_distribution_major_version' from source: facts 30575 1726867616.06859: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867616.07200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867616.09544: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867616.09625: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867616.09665: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 
1726867616.09701: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867616.09729: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867616.09811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867616.09847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867616.09874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.09917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867616.09934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867616.10056: variable 'ansible_distribution' from source: facts 30575 1726867616.10066: variable 'ansible_distribution_major_version' from source: facts 30575 1726867616.10082: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30575 1726867616.10198: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867616.10468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30575 1726867616.10472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867616.10475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.10479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867616.10482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867616.10484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867616.10506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867616.10530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.10564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867616.10575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867616.10622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867616.10642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867616.10663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.10702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867616.10723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867616.10876: variable 'network_connections' from source: include params 30575 1726867616.10888: variable 'interface' from source: play vars 30575 1726867616.11008: variable 'interface' from source: play vars 30575 1726867616.11012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867616.11197: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867616.11234: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867616.11265: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867616.11292: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867616.11457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867616.11461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867616.11472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.11474: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867616.11478: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867616.11692: variable 'network_connections' from source: include params 30575 1726867616.11695: variable 'interface' from source: play vars 30575 1726867616.11755: variable 'interface' from source: play vars 30575 1726867616.11786: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867616.11789: when evaluation is False, skipping this task 30575 1726867616.11796: _execute() done 30575 1726867616.11802: dumping result to json 30575 1726867616.11805: done dumping result, returning 30575 1726867616.11808: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-0000000010fa] 30575 1726867616.11813: sending task result for task 0affcac9-a3a5-e081-a588-0000000010fa 30575 1726867616.12045: 
done sending task result for task 0affcac9-a3a5-e081-a588-0000000010fa 30575 1726867616.12048: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867616.12095: no more pending results, returning what we have 30575 1726867616.12098: results queue empty 30575 1726867616.12099: checking for any_errors_fatal 30575 1726867616.12105: done checking for any_errors_fatal 30575 1726867616.12105: checking for max_fail_percentage 30575 1726867616.12107: done checking for max_fail_percentage 30575 1726867616.12107: checking to see if all hosts have failed and the running result is not ok 30575 1726867616.12108: done checking to see if all hosts have failed 30575 1726867616.12109: getting the remaining hosts for this loop 30575 1726867616.12110: done getting the remaining hosts for this loop 30575 1726867616.12113: getting the next task for host managed_node3 30575 1726867616.12120: done getting next task for host managed_node3 30575 1726867616.12124: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867616.12128: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867616.12146: getting variables 30575 1726867616.12147: in VariableManager get_vars() 30575 1726867616.12182: Calling all_inventory to load vars for managed_node3 30575 1726867616.12185: Calling groups_inventory to load vars for managed_node3 30575 1726867616.12187: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867616.12195: Calling all_plugins_play to load vars for managed_node3 30575 1726867616.12197: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867616.12200: Calling groups_plugins_play to load vars for managed_node3 30575 1726867616.14054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867616.15592: done with get_vars() 30575 1726867616.15615: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867616.15692: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:26:56 -0400 (0:00:00.107) 0:00:51.534 ****** 30575 1726867616.15727: entering _queue_task() for managed_node3/yum 30575 1726867616.16050: worker is 1 (out of 1 available) 30575 1726867616.16065: exiting _queue_task() for managed_node3/yum 30575 1726867616.16183: done queuing things up, now waiting for results queue to drain 30575 1726867616.16185: waiting for pending results... 30575 1726867616.16496: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867616.16563: in run() - task 0affcac9-a3a5-e081-a588-0000000010fb 30575 1726867616.16600: variable 'ansible_search_path' from source: unknown 30575 1726867616.16610: variable 'ansible_search_path' from source: unknown 30575 1726867616.16650: calling self._execute() 30575 1726867616.16758: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867616.16770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867616.16786: variable 'omit' from source: magic vars 30575 1726867616.17161: variable 'ansible_distribution_major_version' from source: facts 30575 1726867616.17246: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867616.17360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867616.19593: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867616.19997: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867616.20039: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867616.20080: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867616.20112: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867616.20197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867616.20231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867616.20266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.20315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867616.20350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867616.20433: variable 'ansible_distribution_major_version' from source: facts 30575 1726867616.20567: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30575 1726867616.20571: when evaluation is False, skipping this task 30575 1726867616.20574: _execute() done 30575 1726867616.20580: dumping result to json 30575 1726867616.20582: done dumping result, returning 30575 1726867616.20585: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-0000000010fb] 30575 1726867616.20588: sending task result for task 0affcac9-a3a5-e081-a588-0000000010fb 30575 1726867616.20659: done sending task result for task 0affcac9-a3a5-e081-a588-0000000010fb 30575 1726867616.20662: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30575 1726867616.20729: no more pending results, returning what we have 30575 1726867616.20732: results queue empty 30575 1726867616.20733: checking for any_errors_fatal 30575 1726867616.20740: done checking for any_errors_fatal 30575 1726867616.20741: checking for max_fail_percentage 30575 1726867616.20743: done checking for max_fail_percentage 30575 1726867616.20744: checking to see if all hosts have failed and the running result is not ok 30575 1726867616.20745: done checking to see if all hosts have failed 30575 1726867616.20746: getting the remaining hosts for this loop 30575 1726867616.20748: done getting the remaining hosts for this loop 30575 1726867616.20752: getting the next task for host managed_node3 30575 1726867616.20761: done getting next task for host managed_node3 30575 1726867616.20765: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867616.20770: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867616.20795: getting variables 30575 1726867616.20797: in VariableManager get_vars() 30575 1726867616.20837: Calling all_inventory to load vars for managed_node3 30575 1726867616.20840: Calling groups_inventory to load vars for managed_node3 30575 1726867616.20842: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867616.20854: Calling all_plugins_play to load vars for managed_node3 30575 1726867616.20857: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867616.20861: Calling groups_plugins_play to load vars for managed_node3 30575 1726867616.22622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867616.24140: done with get_vars() 30575 1726867616.24162: done getting variables 30575 1726867616.24222: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:26:56 -0400 (0:00:00.085) 0:00:51.620 ****** 30575 1726867616.24256: entering _queue_task() for managed_node3/fail 30575 1726867616.24584: worker is 1 (out of 1 available) 30575 1726867616.24596: exiting _queue_task() for managed_node3/fail 30575 1726867616.24608: done queuing things up, now waiting for results queue to drain 30575 1726867616.24609: waiting for pending results... 30575 1726867616.24898: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867616.25041: in run() - task 0affcac9-a3a5-e081-a588-0000000010fc 30575 1726867616.25059: variable 'ansible_search_path' from source: unknown 30575 1726867616.25066: variable 'ansible_search_path' from source: unknown 30575 1726867616.25107: calling self._execute() 30575 1726867616.25205: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867616.25220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867616.25323: variable 'omit' from source: magic vars 30575 1726867616.25602: variable 'ansible_distribution_major_version' from source: facts 30575 1726867616.25620: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867616.25747: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867616.25950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867616.28189: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867616.28268: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867616.28312: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867616.28352: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867616.28389: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867616.28482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867616.28518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867616.28549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.28601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867616.28621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867616.28691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867616.28782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867616.28786: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.28789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867616.28815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867616.28862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867616.28895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867616.28930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.28980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867616.29012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867616.29195: variable 'network_connections' from source: include params 30575 1726867616.29216: variable 'interface' from source: play vars 30575 1726867616.29384: variable 'interface' from source: play vars 30575 1726867616.29387: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867616.29571: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867616.29628: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867616.29663: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867616.29702: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867616.29791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867616.29819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867616.29853: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.29886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867616.29966: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867616.30253: variable 'network_connections' from source: include params 30575 1726867616.30370: variable 'interface' from source: play vars 30575 1726867616.30373: variable 'interface' from source: play vars 30575 1726867616.30375: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867616.30379: when evaluation is False, skipping this task 30575 
1726867616.30381: _execute() done 30575 1726867616.30384: dumping result to json 30575 1726867616.30390: done dumping result, returning 30575 1726867616.30401: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-0000000010fc] 30575 1726867616.30410: sending task result for task 0affcac9-a3a5-e081-a588-0000000010fc 30575 1726867616.30724: done sending task result for task 0affcac9-a3a5-e081-a588-0000000010fc 30575 1726867616.30727: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867616.30787: no more pending results, returning what we have 30575 1726867616.30791: results queue empty 30575 1726867616.30792: checking for any_errors_fatal 30575 1726867616.30800: done checking for any_errors_fatal 30575 1726867616.30801: checking for max_fail_percentage 30575 1726867616.30803: done checking for max_fail_percentage 30575 1726867616.30804: checking to see if all hosts have failed and the running result is not ok 30575 1726867616.30805: done checking to see if all hosts have failed 30575 1726867616.30806: getting the remaining hosts for this loop 30575 1726867616.30808: done getting the remaining hosts for this loop 30575 1726867616.30811: getting the next task for host managed_node3 30575 1726867616.30822: done getting next task for host managed_node3 30575 1726867616.30826: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30575 1726867616.30832: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867616.30859: getting variables 30575 1726867616.30861: in VariableManager get_vars() 30575 1726867616.30900: Calling all_inventory to load vars for managed_node3 30575 1726867616.30902: Calling groups_inventory to load vars for managed_node3 30575 1726867616.30905: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867616.30914: Calling all_plugins_play to load vars for managed_node3 30575 1726867616.30917: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867616.30921: Calling groups_plugins_play to load vars for managed_node3 30575 1726867616.32571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867616.34286: done with get_vars() 30575 1726867616.34307: done getting variables 30575 1726867616.34374: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:26:56 -0400 (0:00:00.101) 0:00:51.721 ****** 30575 1726867616.34420: entering _queue_task() for managed_node3/package 30575 1726867616.34782: worker is 1 (out of 1 available) 30575 1726867616.34797: exiting _queue_task() for managed_node3/package 30575 1726867616.34813: done queuing things up, now waiting for results queue to drain 30575 1726867616.34815: waiting for pending results... 30575 1726867616.35139: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30575 1726867616.35306: in run() - task 0affcac9-a3a5-e081-a588-0000000010fd 30575 1726867616.35329: variable 'ansible_search_path' from source: unknown 30575 1726867616.35338: variable 'ansible_search_path' from source: unknown 30575 1726867616.35387: calling self._execute() 30575 1726867616.35500: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867616.35514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867616.35534: variable 'omit' from source: magic vars 30575 1726867616.35948: variable 'ansible_distribution_major_version' from source: facts 30575 1726867616.35966: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867616.36239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867616.36471: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867616.36524: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867616.36566: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867616.36639: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867616.36767: variable 'network_packages' from source: role '' defaults 30575 1726867616.36895: variable '__network_provider_setup' from source: role '' defaults 30575 1726867616.36910: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867616.36974: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867616.36993: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867616.37185: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867616.37247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867616.39694: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867616.39766: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867616.39829: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867616.39894: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867616.39932: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867616.40040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867616.40093: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867616.40137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.40196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867616.40228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867616.40289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867616.40326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867616.40382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.40441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867616.40588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 
1726867616.40793: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867616.40939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867616.40970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867616.41002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.41051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867616.41072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867616.41186: variable 'ansible_python' from source: facts 30575 1726867616.41207: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867616.41296: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867616.41390: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867616.41530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867616.41563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867616.41599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.41693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867616.41697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867616.41735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867616.41771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867616.41806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.41849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867616.41869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867616.42026: variable 'network_connections' from source: include params 
30575 1726867616.42037: variable 'interface' from source: play vars 30575 1726867616.42200: variable 'interface' from source: play vars 30575 1726867616.42284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867616.42353: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867616.42403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.42438: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867616.42505: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867616.42983: variable 'network_connections' from source: include params 30575 1726867616.42989: variable 'interface' from source: play vars 30575 1726867616.43050: variable 'interface' from source: play vars 30575 1726867616.43137: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867616.43235: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867616.43565: variable 'network_connections' from source: include params 30575 1726867616.43575: variable 'interface' from source: play vars 30575 1726867616.43642: variable 'interface' from source: play vars 30575 1726867616.43675: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867616.43753: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867616.44073: variable 'network_connections' 
from source: include params 30575 1726867616.44087: variable 'interface' from source: play vars 30575 1726867616.44152: variable 'interface' from source: play vars 30575 1726867616.44219: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867616.44385: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867616.44388: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867616.44391: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867616.44579: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867616.45060: variable 'network_connections' from source: include params 30575 1726867616.45071: variable 'interface' from source: play vars 30575 1726867616.45133: variable 'interface' from source: play vars 30575 1726867616.45147: variable 'ansible_distribution' from source: facts 30575 1726867616.45160: variable '__network_rh_distros' from source: role '' defaults 30575 1726867616.45171: variable 'ansible_distribution_major_version' from source: facts 30575 1726867616.45206: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867616.45382: variable 'ansible_distribution' from source: facts 30575 1726867616.45392: variable '__network_rh_distros' from source: role '' defaults 30575 1726867616.45484: variable 'ansible_distribution_major_version' from source: facts 30575 1726867616.45487: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867616.45584: variable 'ansible_distribution' from source: facts 30575 1726867616.45599: variable '__network_rh_distros' from source: role '' defaults 30575 1726867616.45610: variable 'ansible_distribution_major_version' from source: facts 30575 1726867616.45650: variable 'network_provider' from source: set_fact 30575 
1726867616.45673: variable 'ansible_facts' from source: unknown 30575 1726867616.46389: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30575 1726867616.46397: when evaluation is False, skipping this task 30575 1726867616.46462: _execute() done 30575 1726867616.46465: dumping result to json 30575 1726867616.46467: done dumping result, returning 30575 1726867616.46470: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-e081-a588-0000000010fd] 30575 1726867616.46472: sending task result for task 0affcac9-a3a5-e081-a588-0000000010fd 30575 1726867616.46545: done sending task result for task 0affcac9-a3a5-e081-a588-0000000010fd skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30575 1726867616.46624: no more pending results, returning what we have 30575 1726867616.46628: results queue empty 30575 1726867616.46629: checking for any_errors_fatal 30575 1726867616.46637: done checking for any_errors_fatal 30575 1726867616.46638: checking for max_fail_percentage 30575 1726867616.46640: done checking for max_fail_percentage 30575 1726867616.46641: checking to see if all hosts have failed and the running result is not ok 30575 1726867616.46642: done checking to see if all hosts have failed 30575 1726867616.46643: getting the remaining hosts for this loop 30575 1726867616.46644: done getting the remaining hosts for this loop 30575 1726867616.46649: getting the next task for host managed_node3 30575 1726867616.46659: done getting next task for host managed_node3 30575 1726867616.46664: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867616.46669: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867616.46693: getting variables 30575 1726867616.46695: in VariableManager get_vars() 30575 1726867616.46737: Calling all_inventory to load vars for managed_node3 30575 1726867616.46744: Calling groups_inventory to load vars for managed_node3 30575 1726867616.46747: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867616.46760: Calling all_plugins_play to load vars for managed_node3 30575 1726867616.46763: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867616.46766: Calling groups_plugins_play to load vars for managed_node3 30575 1726867616.47461: WORKER PROCESS EXITING 30575 1726867616.49770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867616.52979: done with get_vars() 30575 1726867616.53004: done getting variables 30575 1726867616.53066: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:26:56 -0400 (0:00:00.188) 0:00:51.910 ****** 30575 1726867616.53307: entering _queue_task() for managed_node3/package 30575 1726867616.53870: worker is 1 (out of 1 available) 30575 1726867616.54284: exiting _queue_task() for managed_node3/package 30575 1726867616.54298: done queuing things up, now waiting for results queue to drain 30575 1726867616.54299: waiting for pending results... 
30575 1726867616.54506: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867616.54834: in run() - task 0affcac9-a3a5-e081-a588-0000000010fe 30575 1726867616.54853: variable 'ansible_search_path' from source: unknown 30575 1726867616.54908: variable 'ansible_search_path' from source: unknown 30575 1726867616.54953: calling self._execute() 30575 1726867616.55259: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867616.55369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867616.55372: variable 'omit' from source: magic vars 30575 1726867616.55978: variable 'ansible_distribution_major_version' from source: facts 30575 1726867616.56034: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867616.56275: variable 'network_state' from source: role '' defaults 30575 1726867616.56355: Evaluated conditional (network_state != {}): False 30575 1726867616.56363: when evaluation is False, skipping this task 30575 1726867616.56369: _execute() done 30575 1726867616.56375: dumping result to json 30575 1726867616.56386: done dumping result, returning 30575 1726867616.56399: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-e081-a588-0000000010fe] 30575 1726867616.56410: sending task result for task 0affcac9-a3a5-e081-a588-0000000010fe 30575 1726867616.56745: done sending task result for task 0affcac9-a3a5-e081-a588-0000000010fe 30575 1726867616.56749: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867616.56826: no more pending results, returning what we have 30575 1726867616.56831: results queue empty 30575 1726867616.56831: checking 
for any_errors_fatal 30575 1726867616.56838: done checking for any_errors_fatal 30575 1726867616.56839: checking for max_fail_percentage 30575 1726867616.56841: done checking for max_fail_percentage 30575 1726867616.56842: checking to see if all hosts have failed and the running result is not ok 30575 1726867616.56843: done checking to see if all hosts have failed 30575 1726867616.56844: getting the remaining hosts for this loop 30575 1726867616.56845: done getting the remaining hosts for this loop 30575 1726867616.56849: getting the next task for host managed_node3 30575 1726867616.56859: done getting next task for host managed_node3 30575 1726867616.56864: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867616.56870: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867616.56898: getting variables 30575 1726867616.56900: in VariableManager get_vars() 30575 1726867616.56942: Calling all_inventory to load vars for managed_node3 30575 1726867616.56945: Calling groups_inventory to load vars for managed_node3 30575 1726867616.56948: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867616.56961: Calling all_plugins_play to load vars for managed_node3 30575 1726867616.56964: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867616.56967: Calling groups_plugins_play to load vars for managed_node3 30575 1726867616.59873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867616.63470: done with get_vars() 30575 1726867616.63499: done getting variables 30575 1726867616.63564: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:26:56 -0400 (0:00:00.104) 0:00:52.015 ****** 30575 1726867616.63807: entering _queue_task() for managed_node3/package 30575 1726867616.64430: worker is 1 (out of 1 available) 30575 1726867616.64444: exiting _queue_task() for managed_node3/package 30575 1726867616.64457: done queuing things up, now waiting for results queue to drain 30575 1726867616.64459: waiting for pending results... 
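The install task above was skipped because `network_state` still held its role default of `{}`, so the conditional `network_state != {}` evaluated False. A hypothetical sketch of what such a conditional install task looks like — only the task name, module (`package`), and the two evaluated conditions are taken from the log; the exact body of `roles/network/tasks/main.yml` is an assumption:

```yaml
# Hypothetical reconstruction of the skipped task. Only the name and the
# when-conditions come from the log records above; the package list is
# inferred from the task name and is not confirmed by this log.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'  # evaluated True in the log
    - network_state != {}                        # evaluated False, so the task skips
```

When any item in the `when:` list is False, Ansible reports the task as skipped with `false_condition` naming the failing expression, exactly as the `skipping: [managed_node3]` result above shows.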
30575 1726867616.65269: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867616.65893: in run() - task 0affcac9-a3a5-e081-a588-0000000010ff 30575 1726867616.65898: variable 'ansible_search_path' from source: unknown 30575 1726867616.65901: variable 'ansible_search_path' from source: unknown 30575 1726867616.66011: calling self._execute() 30575 1726867616.66435: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867616.66440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867616.66442: variable 'omit' from source: magic vars 30575 1726867616.67565: variable 'ansible_distribution_major_version' from source: facts 30575 1726867616.67586: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867616.68157: variable 'network_state' from source: role '' defaults 30575 1726867616.68160: Evaluated conditional (network_state != {}): False 30575 1726867616.68163: when evaluation is False, skipping this task 30575 1726867616.68166: _execute() done 30575 1726867616.68168: dumping result to json 30575 1726867616.68171: done dumping result, returning 30575 1726867616.68173: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-e081-a588-0000000010ff] 30575 1726867616.68175: sending task result for task 0affcac9-a3a5-e081-a588-0000000010ff 30575 1726867616.68252: done sending task result for task 0affcac9-a3a5-e081-a588-0000000010ff 30575 1726867616.68257: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867616.68422: no more pending results, returning what we have 30575 1726867616.68425: results queue empty 30575 1726867616.68426: checking for 
any_errors_fatal 30575 1726867616.68435: done checking for any_errors_fatal 30575 1726867616.68436: checking for max_fail_percentage 30575 1726867616.68437: done checking for max_fail_percentage 30575 1726867616.68438: checking to see if all hosts have failed and the running result is not ok 30575 1726867616.68439: done checking to see if all hosts have failed 30575 1726867616.68440: getting the remaining hosts for this loop 30575 1726867616.68441: done getting the remaining hosts for this loop 30575 1726867616.68445: getting the next task for host managed_node3 30575 1726867616.68453: done getting next task for host managed_node3 30575 1726867616.68457: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867616.68463: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867616.68489: getting variables 30575 1726867616.68491: in VariableManager get_vars() 30575 1726867616.68533: Calling all_inventory to load vars for managed_node3 30575 1726867616.68536: Calling groups_inventory to load vars for managed_node3 30575 1726867616.68538: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867616.68551: Calling all_plugins_play to load vars for managed_node3 30575 1726867616.68554: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867616.68557: Calling groups_plugins_play to load vars for managed_node3 30575 1726867616.71287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867616.74557: done with get_vars() 30575 1726867616.74786: done getting variables 30575 1726867616.74849: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:26:56 -0400 (0:00:00.112) 0:00:52.128 ****** 30575 1726867616.75091: entering _queue_task() for managed_node3/service 30575 1726867616.75654: worker is 1 (out of 1 available) 30575 1726867616.75667: exiting _queue_task() for managed_node3/service 30575 1726867616.75883: done queuing things up, now waiting for results queue to drain 30575 1726867616.75886: waiting for pending results... 
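Both install tasks so far key off the same `network_state != {}` check, and the log shows `network_state` coming "from source: role '' defaults", i.e. it is empty unless the playbook supplies one. A hedged sketch of a play that would flip these conditionals to True — the interface values here are purely illustrative and do not come from this run:

```yaml
# Illustrative only: any non-empty network_state mapping would satisfy the
# `network_state != {}` conditionals seen in the log and cause the
# NetworkManager/nmstate install tasks to run instead of skip.
- hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: eth1        # hypothetical interface name
              type: ethernet
              state: up
```

In this run no such variable was provided, so the role falls through to its connection-profile (`network_connections`) code path rather than the nmstate-based `network_state` path.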
30575 1726867616.76397: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867616.76561: in run() - task 0affcac9-a3a5-e081-a588-000000001100 30575 1726867616.76724: variable 'ansible_search_path' from source: unknown 30575 1726867616.76733: variable 'ansible_search_path' from source: unknown 30575 1726867616.76773: calling self._execute() 30575 1726867616.77251: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867616.77255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867616.77467: variable 'omit' from source: magic vars 30575 1726867616.78178: variable 'ansible_distribution_major_version' from source: facts 30575 1726867616.78525: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867616.78880: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867616.79360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867616.84519: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867616.84691: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867616.84739: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867616.84854: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867616.84889: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867616.85145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30575 1726867616.85150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867616.85255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.85304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867616.85326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867616.85410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867616.85498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867616.85608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.85652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867616.85702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867616.85828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867616.85909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867616.86282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.86285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867616.86287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867616.86619: variable 'network_connections' from source: include params 30575 1726867616.86798: variable 'interface' from source: play vars 30575 1726867616.86875: variable 'interface' from source: play vars 30575 1726867616.87052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867616.87945: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867616.88904: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867616.88940: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867616.89382: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867616.89385: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867616.89387: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867616.89408: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867616.89438: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867616.89503: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867616.90121: variable 'network_connections' from source: include params 30575 1726867616.90582: variable 'interface' from source: play vars 30575 1726867616.90586: variable 'interface' from source: play vars 30575 1726867616.90982: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867616.90986: when evaluation is False, skipping this task 30575 1726867616.90989: _execute() done 30575 1726867616.90991: dumping result to json 30575 1726867616.90993: done dumping result, returning 30575 1726867616.90996: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001100] 30575 1726867616.90998: sending task result for task 0affcac9-a3a5-e081-a588-000000001100 30575 1726867616.91073: done sending task result for task 
0affcac9-a3a5-e081-a588-000000001100 30575 1726867616.91129: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867616.91140: no more pending results, returning what we have 30575 1726867616.91143: results queue empty 30575 1726867616.91144: checking for any_errors_fatal 30575 1726867616.91150: done checking for any_errors_fatal 30575 1726867616.91151: checking for max_fail_percentage 30575 1726867616.91152: done checking for max_fail_percentage 30575 1726867616.91154: checking to see if all hosts have failed and the running result is not ok 30575 1726867616.91155: done checking to see if all hosts have failed 30575 1726867616.91155: getting the remaining hosts for this loop 30575 1726867616.91157: done getting the remaining hosts for this loop 30575 1726867616.91162: getting the next task for host managed_node3 30575 1726867616.91170: done getting next task for host managed_node3 30575 1726867616.91175: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867616.91384: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867616.91407: getting variables 30575 1726867616.91409: in VariableManager get_vars() 30575 1726867616.91451: Calling all_inventory to load vars for managed_node3 30575 1726867616.91453: Calling groups_inventory to load vars for managed_node3 30575 1726867616.91456: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867616.91466: Calling all_plugins_play to load vars for managed_node3 30575 1726867616.91469: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867616.91472: Calling groups_plugins_play to load vars for managed_node3 30575 1726867616.94351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867616.97505: done with get_vars() 30575 1726867616.97532: done getting variables 30575 1726867616.97596: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:26:56 -0400 (0:00:00.225) 0:00:52.353 ****** 30575 1726867616.97633: entering _queue_task() for managed_node3/service 30575 1726867616.98402: worker is 1 (out of 1 available) 30575 1726867616.98419: exiting _queue_task() for managed_node3/service 30575 1726867616.98434: done 
queuing things up, now waiting for results queue to drain 30575 1726867616.98436: waiting for pending results... 30575 1726867616.99058: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867616.99359: in run() - task 0affcac9-a3a5-e081-a588-000000001101 30575 1726867616.99443: variable 'ansible_search_path' from source: unknown 30575 1726867616.99453: variable 'ansible_search_path' from source: unknown 30575 1726867616.99498: calling self._execute() 30575 1726867616.99683: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867616.99764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867617.00083: variable 'omit' from source: magic vars 30575 1726867617.00550: variable 'ansible_distribution_major_version' from source: facts 30575 1726867617.00568: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867617.00964: variable 'network_provider' from source: set_fact 30575 1726867617.00975: variable 'network_state' from source: role '' defaults 30575 1726867617.00994: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30575 1726867617.01004: variable 'omit' from source: magic vars 30575 1726867617.01188: variable 'omit' from source: magic vars 30575 1726867617.01222: variable 'network_service_name' from source: role '' defaults 30575 1726867617.01439: variable 'network_service_name' from source: role '' defaults 30575 1726867617.01562: variable '__network_provider_setup' from source: role '' defaults 30575 1726867617.01791: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867617.01865: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867617.01874: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867617.01932: variable '__network_packages_default_nm' from source: role '' 
defaults 30575 1726867617.02560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867617.09049: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867617.09131: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867617.09168: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867617.09782: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867617.09786: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867617.09790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867617.09793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867617.09796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867617.09798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867617.09800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867617.10039: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867617.10063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867617.10090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867617.10244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867617.10258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867617.10785: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867617.11027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867617.11051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867617.11075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867617.11332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867617.11349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867617.11554: variable 'ansible_python' from source: facts 30575 1726867617.11570: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867617.11651: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867617.11882: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867617.12123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867617.12393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867617.12396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867617.12399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867617.12401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867617.12821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867617.12850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867617.12865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867617.12905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867617.12921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867617.13356: variable 'network_connections' from source: include params 30575 1726867617.13363: variable 'interface' from source: play vars 30575 1726867617.13435: variable 'interface' from source: play vars 30575 1726867617.13943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867617.14537: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867617.14585: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867617.14627: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867617.14667: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867617.15130: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867617.15159: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867617.15225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867617.15228: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867617.15270: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867617.16343: variable 'network_connections' from source: include params 30575 1726867617.16350: variable 'interface' from source: play vars 30575 1726867617.16423: variable 'interface' from source: play vars 30575 1726867617.16527: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867617.16945: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867617.17628: variable 'network_connections' from source: include params 30575 1726867617.17631: variable 'interface' from source: play vars 30575 1726867617.18101: variable 'interface' from source: play vars 30575 1726867617.18124: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867617.18200: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867617.18881: variable 'network_connections' from source: include params 30575 1726867617.19290: variable 'interface' from source: play vars 30575 1726867617.19383: variable 'interface' from source: play vars 30575 1726867617.19420: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30575 1726867617.19473: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867617.19480: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867617.19940: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867617.20583: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867617.21838: variable 'network_connections' from source: include params 30575 1726867617.21841: variable 'interface' from source: play vars 30575 1726867617.22185: variable 'interface' from source: play vars 30575 1726867617.22190: variable 'ansible_distribution' from source: facts 30575 1726867617.22192: variable '__network_rh_distros' from source: role '' defaults 30575 1726867617.22194: variable 'ansible_distribution_major_version' from source: facts 30575 1726867617.22196: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867617.22717: variable 'ansible_distribution' from source: facts 30575 1726867617.22721: variable '__network_rh_distros' from source: role '' defaults 30575 1726867617.22725: variable 'ansible_distribution_major_version' from source: facts 30575 1726867617.22750: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867617.23322: variable 'ansible_distribution' from source: facts 30575 1726867617.23325: variable '__network_rh_distros' from source: role '' defaults 30575 1726867617.23382: variable 'ansible_distribution_major_version' from source: facts 30575 1726867617.23385: variable 'network_provider' from source: set_fact 30575 1726867617.23388: variable 'omit' from source: magic vars 30575 1726867617.23422: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867617.23442: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867617.23460: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867617.23479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867617.23909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867617.23948: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867617.23952: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867617.23954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867617.24058: Set connection var ansible_pipelining to False 30575 1726867617.24061: Set connection var ansible_shell_type to sh 30575 1726867617.24064: Set connection var ansible_shell_executable to /bin/sh 30575 1726867617.24066: Set connection var ansible_timeout to 10 30575 1726867617.24279: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867617.24284: Set connection var ansible_connection to ssh 30575 1726867617.24499: variable 'ansible_shell_executable' from source: unknown 30575 1726867617.24507: variable 'ansible_connection' from source: unknown 30575 1726867617.24513: variable 'ansible_module_compression' from source: unknown 30575 1726867617.24518: variable 'ansible_shell_type' from source: unknown 30575 1726867617.24520: variable 'ansible_shell_executable' from source: unknown 30575 1726867617.24522: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867617.24524: variable 'ansible_pipelining' from source: unknown 30575 1726867617.24526: variable 'ansible_timeout' from source: unknown 30575 1726867617.24528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867617.24624: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867617.24631: variable 'omit' from source: magic vars 30575 1726867617.24638: starting attempt loop 30575 1726867617.24640: running the handler 30575 1726867617.24899: variable 'ansible_facts' from source: unknown 30575 1726867617.26982: _low_level_execute_command(): starting 30575 1726867617.26986: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867617.28495: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867617.28508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867617.28528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867617.28805: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30575 1726867617.30525: stdout chunk (state=3): >>>/root <<< 30575 1726867617.30624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867617.30667: stderr chunk (state=3): >>><<< 30575 1726867617.30674: stdout chunk (state=3): >>><<< 30575 1726867617.30700: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867617.30712: _low_level_execute_command(): starting 30575 1726867617.30720: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867617.306996-33137-43931612129962 `" && echo ansible-tmp-1726867617.306996-33137-43931612129962="` echo /root/.ansible/tmp/ansible-tmp-1726867617.306996-33137-43931612129962 `" ) 
&& sleep 0' 30575 1726867617.31863: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867617.31867: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867617.31869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867617.31871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867617.31874: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867617.31876: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867617.31880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867617.31883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867617.31887: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867617.31890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867617.31891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867617.31893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867617.31975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867617.33922: stdout chunk (state=3): 
>>>ansible-tmp-1726867617.306996-33137-43931612129962=/root/.ansible/tmp/ansible-tmp-1726867617.306996-33137-43931612129962 <<< 30575 1726867617.34038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867617.34137: stderr chunk (state=3): >>><<< 30575 1726867617.34141: stdout chunk (state=3): >>><<< 30575 1726867617.34152: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867617.306996-33137-43931612129962=/root/.ansible/tmp/ansible-tmp-1726867617.306996-33137-43931612129962 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867617.34191: variable 'ansible_module_compression' from source: unknown 30575 1726867617.34245: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30575 1726867617.34518: variable 'ansible_facts' from 
source: unknown 30575 1726867617.34837: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867617.306996-33137-43931612129962/AnsiballZ_systemd.py 30575 1726867617.35161: Sending initial data 30575 1726867617.35165: Sent initial data (154 bytes) 30575 1726867617.36586: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867617.36995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867617.37162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867617.37296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867617.38887: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30575 1726867617.38893: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867617.38956: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867617.39012: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpeme8hree /root/.ansible/tmp/ansible-tmp-1726867617.306996-33137-43931612129962/AnsiballZ_systemd.py <<< 30575 1726867617.39025: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867617.306996-33137-43931612129962/AnsiballZ_systemd.py" <<< 30575 1726867617.39057: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpeme8hree" to remote "/root/.ansible/tmp/ansible-tmp-1726867617.306996-33137-43931612129962/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867617.306996-33137-43931612129962/AnsiballZ_systemd.py" <<< 30575 1726867617.42931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867617.42971: stderr chunk (state=3): >>><<< 30575 1726867617.42984: stdout chunk (state=3): >>><<< 30575 1726867617.43119: done transferring module to remote 30575 1726867617.43123: _low_level_execute_command(): starting 30575 1726867617.43140: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867617.306996-33137-43931612129962/ /root/.ansible/tmp/ansible-tmp-1726867617.306996-33137-43931612129962/AnsiballZ_systemd.py && sleep 0' 30575 
1726867617.44483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867617.44503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867617.44522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867617.44541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867617.44567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867617.44665: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867617.44792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867617.44956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867617.46808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867617.47061: stderr chunk (state=3): >>><<< 30575 1726867617.47381: stdout chunk (state=3): >>><<< 30575 1726867617.47384: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867617.47387: _low_level_execute_command(): starting 30575 1726867617.47389: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867617.306996-33137-43931612129962/AnsiballZ_systemd.py && sleep 0' 30575 1726867617.49096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867617.49213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867617.49234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867617.49250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867617.49496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867617.79219: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": 
"21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10514432", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314499584", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1843836000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": 
"[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", 
"CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system<<< 30575 1726867617.79303: stdout chunk (state=3): >>>.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", 
"InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30575 1726867617.81246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867617.81257: stdout chunk (state=3): >>><<< 30575 1726867617.81270: stderr chunk (state=3): >>><<< 30575 1726867617.81297: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10514432", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314499584", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1843836000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867617.81873: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867617.306996-33137-43931612129962/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867617.81888: _low_level_execute_command(): starting 30575 1726867617.81891: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867617.306996-33137-43931612129962/ > /dev/null 2>&1 && sleep 0' 30575 1726867617.83011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867617.83027: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867617.83072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867617.83090: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867617.83105: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867617.83147: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867617.83285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867617.83321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867617.83358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867617.83462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867617.85420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867617.85430: stdout chunk (state=3): >>><<< 30575 1726867617.85440: stderr chunk (state=3): >>><<< 30575 1726867617.85463: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867617.85683: handler run complete 30575 1726867617.85686: attempt loop complete, returning result 30575 1726867617.85689: _execute() done 30575 1726867617.85691: dumping result to json 30575 1726867617.85693: done dumping result, returning 30575 1726867617.85695: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-e081-a588-000000001101] 30575 1726867617.85809: sending task result for task 0affcac9-a3a5-e081-a588-000000001101 30575 1726867617.86548: done sending task result for task 0affcac9-a3a5-e081-a588-000000001101 30575 1726867617.86552: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867617.86718: no more pending results, returning what we have 30575 1726867617.86722: results queue empty 30575 1726867617.86723: checking for any_errors_fatal 30575 1726867617.86732: done checking for any_errors_fatal 30575 1726867617.86733: checking for max_fail_percentage 30575 1726867617.86735: done checking for max_fail_percentage 30575 1726867617.86736: checking to see if all hosts have failed and the running result is not ok 30575 1726867617.86737: done checking to see if all hosts have failed 30575 1726867617.86737: getting the remaining hosts for this loop 30575 1726867617.86739: done getting the remaining hosts for this loop 30575 1726867617.86744: getting the next task for host managed_node3 30575 1726867617.86754: done getting next task for host managed_node3 30575 1726867617.86768: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867617.86774: ^ state is: HOST STATE: block=6, task=2, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867617.86788: getting variables 30575 1726867617.86790: in VariableManager get_vars() 30575 1726867617.86825: Calling all_inventory to load vars for managed_node3 30575 1726867617.86828: Calling groups_inventory to load vars for managed_node3 30575 1726867617.86830: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867617.86839: Calling all_plugins_play to load vars for managed_node3 30575 1726867617.86842: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867617.86845: Calling groups_plugins_play to load vars for managed_node3 30575 1726867617.90466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867617.93901: done with get_vars() 30575 1726867617.93933: done getting variables 30575 1726867617.94149: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:26:57 -0400 (0:00:00.965) 0:00:53.319 ****** 30575 1726867617.94201: entering _queue_task() for managed_node3/service 30575 1726867617.94838: worker is 1 (out of 1 available) 30575 1726867617.94851: exiting _queue_task() for managed_node3/service 30575 1726867617.94864: done queuing things up, now waiting for results queue to drain 30575 1726867617.94866: waiting for pending results... 
30575 1726867617.95234: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867617.95441: in run() - task 0affcac9-a3a5-e081-a588-000000001102 30575 1726867617.95445: variable 'ansible_search_path' from source: unknown 30575 1726867617.95448: variable 'ansible_search_path' from source: unknown 30575 1726867617.95451: calling self._execute() 30575 1726867617.95550: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867617.95562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867617.95586: variable 'omit' from source: magic vars 30575 1726867617.96010: variable 'ansible_distribution_major_version' from source: facts 30575 1726867617.96027: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867617.96159: variable 'network_provider' from source: set_fact 30575 1726867617.96171: Evaluated conditional (network_provider == "nm"): True 30575 1726867617.96282: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867617.96457: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867617.96640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867618.00313: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867618.00591: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867618.00596: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867618.00826: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867618.00830: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867618.00960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867618.01023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867618.01062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867618.01224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867618.01228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867618.01308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867618.01406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867618.01465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867618.01520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867618.01539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867618.01613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867618.01804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867618.01844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867618.01850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867618.01900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867618.02421: variable 'network_connections' from source: include params 30575 1726867618.02425: variable 'interface' from source: play vars 30575 1726867618.02511: variable 'interface' from source: play vars 30575 1726867618.02781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867618.03132: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867618.03238: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867618.03362: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867618.03517: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867618.03620: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867618.03623: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867618.03740: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867618.03776: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867618.03845: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867618.04131: variable 'network_connections' from source: include params 30575 1726867618.04142: variable 'interface' from source: play vars 30575 1726867618.04221: variable 'interface' from source: play vars 30575 1726867618.04273: Evaluated conditional (__network_wpa_supplicant_required): False 30575 1726867618.04283: when evaluation is False, skipping this task 30575 1726867618.04304: _execute() done 30575 1726867618.04317: dumping result to json 30575 1726867618.04326: done dumping result, returning 30575 1726867618.04367: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-e081-a588-000000001102] 30575 
1726867618.04392: sending task result for task 0affcac9-a3a5-e081-a588-000000001102 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30575 1726867618.04631: no more pending results, returning what we have 30575 1726867618.04635: results queue empty 30575 1726867618.04636: checking for any_errors_fatal 30575 1726867618.04659: done checking for any_errors_fatal 30575 1726867618.04660: checking for max_fail_percentage 30575 1726867618.04662: done checking for max_fail_percentage 30575 1726867618.04663: checking to see if all hosts have failed and the running result is not ok 30575 1726867618.04664: done checking to see if all hosts have failed 30575 1726867618.04664: getting the remaining hosts for this loop 30575 1726867618.04666: done getting the remaining hosts for this loop 30575 1726867618.04670: getting the next task for host managed_node3 30575 1726867618.04681: done getting next task for host managed_node3 30575 1726867618.04685: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867618.04885: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867618.05085: getting variables 30575 1726867618.05088: in VariableManager get_vars() 30575 1726867618.05125: Calling all_inventory to load vars for managed_node3 30575 1726867618.05128: Calling groups_inventory to load vars for managed_node3 30575 1726867618.05130: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867618.05140: Calling all_plugins_play to load vars for managed_node3 30575 1726867618.05142: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867618.05145: Calling groups_plugins_play to load vars for managed_node3 30575 1726867618.05808: done sending task result for task 0affcac9-a3a5-e081-a588-000000001102 30575 1726867618.05811: WORKER PROCESS EXITING 30575 1726867618.09062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867618.14755: done with get_vars() 30575 1726867618.14786: done getting variables 30575 1726867618.15063: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:26:58 -0400 (0:00:00.209) 0:00:53.530 ****** 30575 1726867618.15295: entering _queue_task() for managed_node3/service 30575 1726867618.16048: worker is 1 (out of 1 available) 30575 
1726867618.16061: exiting _queue_task() for managed_node3/service 30575 1726867618.16075: done queuing things up, now waiting for results queue to drain 30575 1726867618.16079: waiting for pending results... 30575 1726867618.16776: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867618.17005: in run() - task 0affcac9-a3a5-e081-a588-000000001103 30575 1726867618.17086: variable 'ansible_search_path' from source: unknown 30575 1726867618.17089: variable 'ansible_search_path' from source: unknown 30575 1726867618.17092: calling self._execute() 30575 1726867618.17279: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867618.17400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867618.17411: variable 'omit' from source: magic vars 30575 1726867618.18298: variable 'ansible_distribution_major_version' from source: facts 30575 1726867618.18317: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867618.18724: variable 'network_provider' from source: set_fact 30575 1726867618.18734: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867618.18737: when evaluation is False, skipping this task 30575 1726867618.18740: _execute() done 30575 1726867618.18743: dumping result to json 30575 1726867618.18760: done dumping result, returning 30575 1726867618.18764: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-e081-a588-000000001103] 30575 1726867618.18766: sending task result for task 0affcac9-a3a5-e081-a588-000000001103 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867618.19255: no more pending results, returning what we have 30575 1726867618.19262: results queue empty 30575 1726867618.19263: 
checking for any_errors_fatal 30575 1726867618.19273: done checking for any_errors_fatal 30575 1726867618.19338: checking for max_fail_percentage 30575 1726867618.19341: done checking for max_fail_percentage 30575 1726867618.19343: checking to see if all hosts have failed and the running result is not ok 30575 1726867618.19344: done checking to see if all hosts have failed 30575 1726867618.19344: getting the remaining hosts for this loop 30575 1726867618.19346: done getting the remaining hosts for this loop 30575 1726867618.19350: getting the next task for host managed_node3 30575 1726867618.19365: done getting next task for host managed_node3 30575 1726867618.19370: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867618.19375: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867618.19406: getting variables 30575 1726867618.19411: in VariableManager get_vars() 30575 1726867618.19453: Calling all_inventory to load vars for managed_node3 30575 1726867618.19456: Calling groups_inventory to load vars for managed_node3 30575 1726867618.19461: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867618.19475: Calling all_plugins_play to load vars for managed_node3 30575 1726867618.19692: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867618.19699: done sending task result for task 0affcac9-a3a5-e081-a588-000000001103 30575 1726867618.19702: WORKER PROCESS EXITING 30575 1726867618.19706: Calling groups_plugins_play to load vars for managed_node3 30575 1726867618.22807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867618.27097: done with get_vars() 30575 1726867618.27270: done getting variables 30575 1726867618.27473: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:26:58 -0400 (0:00:00.122) 0:00:53.652 ****** 30575 1726867618.27526: entering _queue_task() for managed_node3/copy 30575 1726867618.28528: worker is 1 (out of 1 available) 30575 1726867618.28543: exiting _queue_task() for managed_node3/copy 30575 1726867618.28683: done queuing things up, now waiting for results queue to drain 30575 1726867618.28688: waiting for pending results... 
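
[annotation] Both skipped tasks in this stretch of the log are gated by the role's provider check: the executor logs `Evaluated conditional (network_provider == "initscripts"): False` and skips the task, because the play earlier set `network_provider` via `set_fact` (to `nm`, per the module invocation later in the log). A hedged sketch of what such a gated task looks like — only the task name, the `copy` action plugin, and the conditional are confirmed by the log; the `dest`/`content` values below are invented placeholders:

```yaml
# Sketch of the gated task at roles/network/tasks/main.yml:150.
# Confirmed by the log: the name, the copy action, the when-condition.
# Assumed for illustration: dest and content.
- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    dest: /etc/sysconfig/network        # hypothetical destination
    content: "# Managed by the network system role\n"
  when: network_provider == "initscripts"   # False here, so the task skips
```

When the condition is False, Ansible short-circuits before the module runs, which is why the result JSON shows only `"skip_reason": "Conditional result was False"` and `"changed": false`.
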
30575 1726867618.29136: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867618.29420: in run() - task 0affcac9-a3a5-e081-a588-000000001104 30575 1726867618.29488: variable 'ansible_search_path' from source: unknown 30575 1726867618.29497: variable 'ansible_search_path' from source: unknown 30575 1726867618.29541: calling self._execute() 30575 1726867618.29883: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867618.29968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867618.30105: variable 'omit' from source: magic vars 30575 1726867618.31282: variable 'ansible_distribution_major_version' from source: facts 30575 1726867618.31344: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867618.31694: variable 'network_provider' from source: set_fact 30575 1726867618.31699: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867618.31705: when evaluation is False, skipping this task 30575 1726867618.31708: _execute() done 30575 1726867618.31788: dumping result to json 30575 1726867618.31792: done dumping result, returning 30575 1726867618.31801: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-e081-a588-000000001104] 30575 1726867618.31807: sending task result for task 0affcac9-a3a5-e081-a588-000000001104 30575 1726867618.32196: done sending task result for task 0affcac9-a3a5-e081-a588-000000001104 30575 1726867618.32200: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30575 1726867618.32261: no more pending results, returning what we have 30575 1726867618.32267: results queue empty 30575 1726867618.32268: checking for 
any_errors_fatal 30575 1726867618.32278: done checking for any_errors_fatal 30575 1726867618.32279: checking for max_fail_percentage 30575 1726867618.32281: done checking for max_fail_percentage 30575 1726867618.32282: checking to see if all hosts have failed and the running result is not ok 30575 1726867618.32282: done checking to see if all hosts have failed 30575 1726867618.32283: getting the remaining hosts for this loop 30575 1726867618.32285: done getting the remaining hosts for this loop 30575 1726867618.32288: getting the next task for host managed_node3 30575 1726867618.32296: done getting next task for host managed_node3 30575 1726867618.32301: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867618.32306: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867618.32331: getting variables 30575 1726867618.32333: in VariableManager get_vars() 30575 1726867618.32369: Calling all_inventory to load vars for managed_node3 30575 1726867618.32372: Calling groups_inventory to load vars for managed_node3 30575 1726867618.32374: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867618.32508: Calling all_plugins_play to load vars for managed_node3 30575 1726867618.32512: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867618.32518: Calling groups_plugins_play to load vars for managed_node3 30575 1726867618.35564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867618.39295: done with get_vars() 30575 1726867618.39320: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:26:58 -0400 (0:00:00.120) 0:00:53.772 ****** 30575 1726867618.39530: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867618.40253: worker is 1 (out of 1 available) 30575 1726867618.40268: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867618.40494: done queuing things up, now waiting for results queue to drain 30575 1726867618.40496: waiting for pending results... 
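
[annotation] The next task is the one that actually changes the host: it queues the collection's own `network_connections` action. The `module_args` captured in the result JSON further down show exactly which role input produced this call, so the play's variables can be reconstructed as the following sketch (variable names follow the `fedora.linux_system_roles.network` role; the log shows `name` being filled from an `interface` play var):

```yaml
# Role input matching the module_args seen in the result JSON below:
# provider nm, one bridge profile "statebr" with DHCPv4 and IPv6
# autoconf disabled.
network_provider: nm
network_connections:
  - name: statebr              # from the 'interface' play var
    type: bridge
    persistent_state: present
    ip:
      dhcp4: false
      auto6: false
```

The module reports `"changed": true` with stderr line `[002] #0 ... add connection statebr`, i.e. NetworkManager created a new persistent profile for the bridge.
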
30575 1726867618.40941: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867618.41132: in run() - task 0affcac9-a3a5-e081-a588-000000001105 30575 1726867618.41147: variable 'ansible_search_path' from source: unknown 30575 1726867618.41150: variable 'ansible_search_path' from source: unknown 30575 1726867618.41186: calling self._execute() 30575 1726867618.41396: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867618.41400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867618.41468: variable 'omit' from source: magic vars 30575 1726867618.42695: variable 'ansible_distribution_major_version' from source: facts 30575 1726867618.42706: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867618.42712: variable 'omit' from source: magic vars 30575 1726867618.42771: variable 'omit' from source: magic vars 30575 1726867618.43250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867618.47684: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867618.47862: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867618.47902: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867618.47960: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867618.47963: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867618.48154: variable 'network_provider' from source: set_fact 30575 1726867618.48383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867618.48388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867618.48391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867618.48621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867618.48636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867618.48794: variable 'omit' from source: magic vars 30575 1726867618.49051: variable 'omit' from source: magic vars 30575 1726867618.49264: variable 'network_connections' from source: include params 30575 1726867618.49274: variable 'interface' from source: play vars 30575 1726867618.49408: variable 'interface' from source: play vars 30575 1726867618.49786: variable 'omit' from source: magic vars 30575 1726867618.49790: variable '__lsr_ansible_managed' from source: task vars 30575 1726867618.49792: variable '__lsr_ansible_managed' from source: task vars 30575 1726867618.50370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30575 1726867618.50634: Loaded config def from plugin (lookup/template) 30575 1726867618.50638: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30575 1726867618.50641: File lookup term: get_ansible_managed.j2 30575 1726867618.50643: variable 
'ansible_search_path' from source: unknown 30575 1726867618.50646: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30575 1726867618.50666: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30575 1726867618.50685: variable 'ansible_search_path' from source: unknown 30575 1726867618.62014: variable 'ansible_managed' from source: unknown 30575 1726867618.62185: variable 'omit' from source: magic vars 30575 1726867618.62217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867618.62240: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867618.62263: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867618.62297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30575 1726867618.62307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867618.62339: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867618.62342: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867618.62345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867618.62483: Set connection var ansible_pipelining to False 30575 1726867618.62486: Set connection var ansible_shell_type to sh 30575 1726867618.62488: Set connection var ansible_shell_executable to /bin/sh 30575 1726867618.62491: Set connection var ansible_timeout to 10 30575 1726867618.62493: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867618.62495: Set connection var ansible_connection to ssh 30575 1726867618.62497: variable 'ansible_shell_executable' from source: unknown 30575 1726867618.62499: variable 'ansible_connection' from source: unknown 30575 1726867618.62501: variable 'ansible_module_compression' from source: unknown 30575 1726867618.62503: variable 'ansible_shell_type' from source: unknown 30575 1726867618.62505: variable 'ansible_shell_executable' from source: unknown 30575 1726867618.62508: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867618.62557: variable 'ansible_pipelining' from source: unknown 30575 1726867618.62560: variable 'ansible_timeout' from source: unknown 30575 1726867618.62563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867618.62775: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867618.62788: variable 'omit' from 
source: magic vars 30575 1726867618.62791: starting attempt loop 30575 1726867618.62793: running the handler 30575 1726867618.62796: _low_level_execute_command(): starting 30575 1726867618.62798: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867618.63356: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867618.63368: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867618.63382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867618.63396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867618.63408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867618.63418: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867618.63441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867618.63568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867618.63572: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867618.63607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867618.65360: stdout chunk (state=3): >>>/root 
<<< 30575 1726867618.65396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867618.65439: stderr chunk (state=3): >>><<< 30575 1726867618.65504: stdout chunk (state=3): >>><<< 30575 1726867618.65527: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867618.65600: _low_level_execute_command(): starting 30575 1726867618.65608: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867618.655274-33175-1830668531460 `" && echo ansible-tmp-1726867618.655274-33175-1830668531460="` echo /root/.ansible/tmp/ansible-tmp-1726867618.655274-33175-1830668531460 `" ) && sleep 0' 30575 1726867618.66733: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867618.66736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867618.66739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867618.66741: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867618.66744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867618.66929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867618.66935: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867618.66973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867618.68858: stdout chunk (state=3): >>>ansible-tmp-1726867618.655274-33175-1830668531460=/root/.ansible/tmp/ansible-tmp-1726867618.655274-33175-1830668531460 <<< 30575 1726867618.69019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867618.69056: stdout chunk (state=3): >>><<< 30575 1726867618.69059: stderr chunk (state=3): >>><<< 30575 1726867618.69080: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726867618.655274-33175-1830668531460=/root/.ansible/tmp/ansible-tmp-1726867618.655274-33175-1830668531460 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867618.69292: variable 'ansible_module_compression' from source: unknown 30575 1726867618.69295: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30575 1726867618.69298: variable 'ansible_facts' from source: unknown 30575 1726867618.69405: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867618.655274-33175-1830668531460/AnsiballZ_network_connections.py 30575 1726867618.69640: Sending initial data 30575 1726867618.69643: Sent initial data (165 bytes) 30575 1726867618.70238: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867618.70429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867618.70603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867618.72082: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867618.72147: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867618.72214: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp4y269fa3 /root/.ansible/tmp/ansible-tmp-1726867618.655274-33175-1830668531460/AnsiballZ_network_connections.py <<< 30575 1726867618.72247: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867618.655274-33175-1830668531460/AnsiballZ_network_connections.py" <<< 30575 1726867618.72298: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp4y269fa3" to remote "/root/.ansible/tmp/ansible-tmp-1726867618.655274-33175-1830668531460/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867618.655274-33175-1830668531460/AnsiballZ_network_connections.py" <<< 30575 1726867618.73911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867618.73914: stderr chunk (state=3): >>><<< 30575 1726867618.73986: stdout chunk (state=3): >>><<< 30575 1726867618.73989: done transferring module to remote 30575 1726867618.73992: _low_level_execute_command(): starting 30575 1726867618.73994: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867618.655274-33175-1830668531460/ /root/.ansible/tmp/ansible-tmp-1726867618.655274-33175-1830668531460/AnsiballZ_network_connections.py && sleep 0' 30575 1726867618.75042: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867618.75393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867618.75405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867618.75414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867618.75493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867618.77460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867618.77464: stdout chunk (state=3): >>><<< 30575 1726867618.77483: stderr chunk (state=3): >>><<< 30575 1726867618.77487: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867618.77492: _low_level_execute_command(): starting 30575 1726867618.77511: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867618.655274-33175-1830668531460/AnsiballZ_network_connections.py && sleep 0' 30575 1726867618.78897: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867618.79210: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867618.79252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867618.79379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 
1726867618.79424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867618.79554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867619.07476: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 12e4c575-fa21-4cd0-afc7-2cb6b45b6219\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30575 1726867619.10290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867619.10298: stdout chunk (state=3): >>><<< 30575 1726867619.10300: stderr chunk (state=3): >>><<< 30575 1726867619.10303: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 12e4c575-fa21-4cd0-afc7-2cb6b45b6219\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867619.10338: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867618.655274-33175-1830668531460/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867619.10395: _low_level_execute_command(): starting 30575 1726867619.10398: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867618.655274-33175-1830668531460/ > /dev/null 2>&1 && sleep 0' 30575 1726867619.10901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867619.10911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867619.10921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867619.10956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867619.10959: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867619.10962: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867619.10964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867619.11082: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867619.11085: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867619.11088: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867619.11090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867619.11092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867619.11094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867619.11096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867619.11098: stderr chunk (state=3): >>>debug2: match found <<< 30575 1726867619.11100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867619.11102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867619.11104: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867619.11141: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867619.11205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867619.13282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867619.13286: stdout chunk (state=3): >>><<< 30575 1726867619.13288: stderr chunk (state=3): >>><<< 30575 1726867619.13291: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867619.13293: handler run complete 30575 1726867619.13319: attempt loop complete, returning result 30575 1726867619.13585: _execute() done 30575 1726867619.13588: dumping result to json 30575 1726867619.13590: done dumping result, returning 30575 1726867619.13593: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-e081-a588-000000001105] 30575 1726867619.13595: sending task result for task 0affcac9-a3a5-e081-a588-000000001105 30575 1726867619.13674: done sending task result for task 0affcac9-a3a5-e081-a588-000000001105 30575 1726867619.13679: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": 
{ "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 12e4c575-fa21-4cd0-afc7-2cb6b45b6219 30575 1726867619.13800: no more pending results, returning what we have 30575 1726867619.13804: results queue empty 30575 1726867619.13805: checking for any_errors_fatal 30575 1726867619.13811: done checking for any_errors_fatal 30575 1726867619.13812: checking for max_fail_percentage 30575 1726867619.13814: done checking for max_fail_percentage 30575 1726867619.13815: checking to see if all hosts have failed and the running result is not ok 30575 1726867619.13816: done checking to see if all hosts have failed 30575 1726867619.13816: getting the remaining hosts for this loop 30575 1726867619.13818: done getting the remaining hosts for this loop 30575 1726867619.13821: getting the next task for host managed_node3 30575 1726867619.13830: done getting next task for host managed_node3 30575 1726867619.13834: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867619.13839: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867619.13852: getting variables 30575 1726867619.13853: in VariableManager get_vars() 30575 1726867619.14173: Calling all_inventory to load vars for managed_node3 30575 1726867619.14176: Calling groups_inventory to load vars for managed_node3 30575 1726867619.14180: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867619.14191: Calling all_plugins_play to load vars for managed_node3 30575 1726867619.14194: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867619.14197: Calling groups_plugins_play to load vars for managed_node3 30575 1726867619.17439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867619.19135: done with get_vars() 30575 1726867619.19172: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:26:59 -0400 (0:00:00.797) 0:00:54.570 ****** 30575 1726867619.19288: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867619.20463: worker is 1 (out of 1 available) 30575 1726867619.20481: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867619.20497: done queuing things up, now waiting for results queue to drain 30575 1726867619.20498: waiting for pending results... 
30575 1726867619.21286: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867619.21451: in run() - task 0affcac9-a3a5-e081-a588-000000001106 30575 1726867619.21456: variable 'ansible_search_path' from source: unknown 30575 1726867619.21459: variable 'ansible_search_path' from source: unknown 30575 1726867619.21468: calling self._execute() 30575 1726867619.21583: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867619.21668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867619.21838: variable 'omit' from source: magic vars 30575 1726867619.22630: variable 'ansible_distribution_major_version' from source: facts 30575 1726867619.22643: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867619.22887: variable 'network_state' from source: role '' defaults 30575 1726867619.22890: Evaluated conditional (network_state != {}): False 30575 1726867619.22991: when evaluation is False, skipping this task 30575 1726867619.22994: _execute() done 30575 1726867619.22997: dumping result to json 30575 1726867619.23011: done dumping result, returning 30575 1726867619.23014: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-e081-a588-000000001106] 30575 1726867619.23016: sending task result for task 0affcac9-a3a5-e081-a588-000000001106 30575 1726867619.23181: done sending task result for task 0affcac9-a3a5-e081-a588-000000001106 30575 1726867619.23187: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867619.23251: no more pending results, returning what we have 30575 1726867619.23256: results queue empty 30575 1726867619.23256: checking for any_errors_fatal 30575 1726867619.23271: done checking for any_errors_fatal 
30575 1726867619.23271: checking for max_fail_percentage 30575 1726867619.23273: done checking for max_fail_percentage 30575 1726867619.23274: checking to see if all hosts have failed and the running result is not ok 30575 1726867619.23275: done checking to see if all hosts have failed 30575 1726867619.23276: getting the remaining hosts for this loop 30575 1726867619.23280: done getting the remaining hosts for this loop 30575 1726867619.23284: getting the next task for host managed_node3 30575 1726867619.23293: done getting next task for host managed_node3 30575 1726867619.23296: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867619.23302: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867619.23443: getting variables 30575 1726867619.23445: in VariableManager get_vars() 30575 1726867619.23483: Calling all_inventory to load vars for managed_node3 30575 1726867619.23485: Calling groups_inventory to load vars for managed_node3 30575 1726867619.23487: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867619.23496: Calling all_plugins_play to load vars for managed_node3 30575 1726867619.23499: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867619.23501: Calling groups_plugins_play to load vars for managed_node3 30575 1726867619.25270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867619.27045: done with get_vars() 30575 1726867619.27066: done getting variables 30575 1726867619.27151: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:26:59 -0400 (0:00:00.079) 0:00:54.649 ****** 30575 1726867619.27193: entering _queue_task() for managed_node3/debug 30575 1726867619.27798: worker is 1 (out of 1 available) 30575 1726867619.27808: exiting _queue_task() for managed_node3/debug 30575 1726867619.27819: done queuing things up, now waiting for results queue to drain 30575 1726867619.27820: waiting for pending results... 
30575 1726867619.27925: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867619.28129: in run() - task 0affcac9-a3a5-e081-a588-000000001107 30575 1726867619.28166: variable 'ansible_search_path' from source: unknown 30575 1726867619.28176: variable 'ansible_search_path' from source: unknown 30575 1726867619.28228: calling self._execute() 30575 1726867619.28353: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867619.28369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867619.28398: variable 'omit' from source: magic vars 30575 1726867619.28870: variable 'ansible_distribution_major_version' from source: facts 30575 1726867619.28892: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867619.28933: variable 'omit' from source: magic vars 30575 1726867619.29037: variable 'omit' from source: magic vars 30575 1726867619.29067: variable 'omit' from source: magic vars 30575 1726867619.29129: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867619.29256: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867619.29265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867619.29268: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867619.29270: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867619.29311: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867619.29324: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867619.29333: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30575 1726867619.29459: Set connection var ansible_pipelining to False 30575 1726867619.29484: Set connection var ansible_shell_type to sh 30575 1726867619.29504: Set connection var ansible_shell_executable to /bin/sh 30575 1726867619.29519: Set connection var ansible_timeout to 10 30575 1726867619.29580: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867619.29583: Set connection var ansible_connection to ssh 30575 1726867619.29586: variable 'ansible_shell_executable' from source: unknown 30575 1726867619.29597: variable 'ansible_connection' from source: unknown 30575 1726867619.29606: variable 'ansible_module_compression' from source: unknown 30575 1726867619.29623: variable 'ansible_shell_type' from source: unknown 30575 1726867619.29639: variable 'ansible_shell_executable' from source: unknown 30575 1726867619.29650: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867619.29686: variable 'ansible_pipelining' from source: unknown 30575 1726867619.29689: variable 'ansible_timeout' from source: unknown 30575 1726867619.29692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867619.29847: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867619.29865: variable 'omit' from source: magic vars 30575 1726867619.29882: starting attempt loop 30575 1726867619.29886: running the handler 30575 1726867619.30124: variable '__network_connections_result' from source: set_fact 30575 1726867619.30127: handler run complete 30575 1726867619.30129: attempt loop complete, returning result 30575 1726867619.30131: _execute() done 30575 1726867619.30141: dumping result to json 30575 1726867619.30150: 
done dumping result, returning 30575 1726867619.30163: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-e081-a588-000000001107] 30575 1726867619.30172: sending task result for task 0affcac9-a3a5-e081-a588-000000001107 ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 12e4c575-fa21-4cd0-afc7-2cb6b45b6219" ] } 30575 1726867619.30406: no more pending results, returning what we have 30575 1726867619.30409: results queue empty 30575 1726867619.30413: checking for any_errors_fatal 30575 1726867619.30424: done checking for any_errors_fatal 30575 1726867619.30425: checking for max_fail_percentage 30575 1726867619.30427: done checking for max_fail_percentage 30575 1726867619.30428: checking to see if all hosts have failed and the running result is not ok 30575 1726867619.30429: done checking to see if all hosts have failed 30575 1726867619.30430: getting the remaining hosts for this loop 30575 1726867619.30432: done getting the remaining hosts for this loop 30575 1726867619.30435: getting the next task for host managed_node3 30575 1726867619.30559: done getting next task for host managed_node3 30575 1726867619.30564: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867619.30570: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867619.30585: getting variables 30575 1726867619.30591: in VariableManager get_vars() 30575 1726867619.30633: Calling all_inventory to load vars for managed_node3 30575 1726867619.30636: Calling groups_inventory to load vars for managed_node3 30575 1726867619.30638: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867619.30649: Calling all_plugins_play to load vars for managed_node3 30575 1726867619.30651: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867619.30654: Calling groups_plugins_play to load vars for managed_node3 30575 1726867619.31231: done sending task result for task 0affcac9-a3a5-e081-a588-000000001107 30575 1726867619.31235: WORKER PROCESS EXITING 30575 1726867619.33057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867619.34906: done with get_vars() 30575 1726867619.34929: done getting variables 30575 1726867619.34993: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:26:59 -0400 (0:00:00.078) 0:00:54.727 ****** 30575 1726867619.35033: entering _queue_task() for managed_node3/debug 30575 1726867619.35309: worker is 1 (out of 1 available) 30575 1726867619.35325: exiting _queue_task() for managed_node3/debug 30575 1726867619.35337: done queuing things up, now waiting for results queue to drain 30575 1726867619.35339: waiting for pending results... 30575 1726867619.35603: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867619.35766: in run() - task 0affcac9-a3a5-e081-a588-000000001108 30575 1726867619.35791: variable 'ansible_search_path' from source: unknown 30575 1726867619.35805: variable 'ansible_search_path' from source: unknown 30575 1726867619.35849: calling self._execute() 30575 1726867619.35967: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867619.35985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867619.36002: variable 'omit' from source: magic vars 30575 1726867619.37085: variable 'ansible_distribution_major_version' from source: facts 30575 1726867619.37089: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867619.37092: variable 'omit' from source: magic vars 30575 1726867619.37111: variable 'omit' from source: magic vars 30575 1726867619.37150: variable 'omit' from source: magic vars 30575 1726867619.37238: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867619.37281: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867619.37387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867619.37518: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867619.37630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867619.37669: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867619.37672: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867619.37675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867619.38080: Set connection var ansible_pipelining to False 30575 1726867619.38087: Set connection var ansible_shell_type to sh 30575 1726867619.38123: Set connection var ansible_shell_executable to /bin/sh 30575 1726867619.38126: Set connection var ansible_timeout to 10 30575 1726867619.38132: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867619.38142: Set connection var ansible_connection to ssh 30575 1726867619.38158: variable 'ansible_shell_executable' from source: unknown 30575 1726867619.38161: variable 'ansible_connection' from source: unknown 30575 1726867619.38164: variable 'ansible_module_compression' from source: unknown 30575 1726867619.38166: variable 'ansible_shell_type' from source: unknown 30575 1726867619.38168: variable 'ansible_shell_executable' from source: unknown 30575 1726867619.38180: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867619.38182: variable 'ansible_pipelining' from source: unknown 30575 1726867619.38184: variable 'ansible_timeout' from source: unknown 30575 1726867619.38186: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867619.38292: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867619.38302: variable 'omit' from source: magic vars 30575 1726867619.38307: starting attempt loop 30575 1726867619.38312: running the handler 30575 1726867619.38390: variable '__network_connections_result' from source: set_fact 30575 1726867619.38439: variable '__network_connections_result' from source: set_fact 30575 1726867619.38535: handler run complete 30575 1726867619.38552: attempt loop complete, returning result 30575 1726867619.38555: _execute() done 30575 1726867619.38558: dumping result to json 30575 1726867619.38561: done dumping result, returning 30575 1726867619.38569: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-e081-a588-000000001108] 30575 1726867619.38574: sending task result for task 0affcac9-a3a5-e081-a588-000000001108 30575 1726867619.38696: done sending task result for task 0affcac9-a3a5-e081-a588-000000001108 30575 1726867619.38698: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 12e4c575-fa21-4cd0-afc7-2cb6b45b6219\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection 
statebr, 12e4c575-fa21-4cd0-afc7-2cb6b45b6219" ] } } 30575 1726867619.38804: no more pending results, returning what we have 30575 1726867619.38807: results queue empty 30575 1726867619.38809: checking for any_errors_fatal 30575 1726867619.38813: done checking for any_errors_fatal 30575 1726867619.38814: checking for max_fail_percentage 30575 1726867619.38815: done checking for max_fail_percentage 30575 1726867619.38816: checking to see if all hosts have failed and the running result is not ok 30575 1726867619.38817: done checking to see if all hosts have failed 30575 1726867619.38817: getting the remaining hosts for this loop 30575 1726867619.38819: done getting the remaining hosts for this loop 30575 1726867619.38822: getting the next task for host managed_node3 30575 1726867619.38828: done getting next task for host managed_node3 30575 1726867619.38832: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867619.38836: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867619.38846: getting variables 30575 1726867619.38848: in VariableManager get_vars() 30575 1726867619.38925: Calling all_inventory to load vars for managed_node3 30575 1726867619.38927: Calling groups_inventory to load vars for managed_node3 30575 1726867619.38929: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867619.38935: Calling all_plugins_play to load vars for managed_node3 30575 1726867619.38937: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867619.38938: Calling groups_plugins_play to load vars for managed_node3 30575 1726867619.39988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867619.42618: done with get_vars() 30575 1726867619.42639: done getting variables 30575 1726867619.42778: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:26:59 -0400 (0:00:00.077) 0:00:54.805 ****** 30575 1726867619.42813: entering _queue_task() for managed_node3/debug 30575 1726867619.43409: worker is 1 (out of 1 available) 30575 1726867619.43460: exiting _queue_task() for managed_node3/debug 30575 1726867619.43473: done queuing things up, now waiting for results queue to drain 30575 1726867619.43475: waiting for pending results... 
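The entry that follows shows the task being skipped because its `when` guard fails: `network_state` resolves from the role defaults to an empty dict, so `network_state != {}` evaluates to False. A minimal sketch of that guard logic (values taken from this log; not the role's actual source):

```python
# Sketch of the "when: network_state != {}" guard seen in the log below.
# network_state comes from "role '' defaults" and is empty in this run,
# so the "Show debug messages for the network_state" task is skipped.
network_state = {}  # role default in this run

should_run = network_state != {}
print(should_run)  # False -> "when evaluation is False, skipping this task"
```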
30575 1726867619.43926: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867619.44065: in run() - task 0affcac9-a3a5-e081-a588-000000001109 30575 1726867619.44132: variable 'ansible_search_path' from source: unknown 30575 1726867619.44167: variable 'ansible_search_path' from source: unknown 30575 1726867619.44644: calling self._execute() 30575 1726867619.44848: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867619.44928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867619.44966: variable 'omit' from source: magic vars 30575 1726867619.45926: variable 'ansible_distribution_major_version' from source: facts 30575 1726867619.45945: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867619.46190: variable 'network_state' from source: role '' defaults 30575 1726867619.46194: Evaluated conditional (network_state != {}): False 30575 1726867619.46196: when evaluation is False, skipping this task 30575 1726867619.46198: _execute() done 30575 1726867619.46200: dumping result to json 30575 1726867619.46202: done dumping result, returning 30575 1726867619.46205: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-e081-a588-000000001109] 30575 1726867619.46207: sending task result for task 0affcac9-a3a5-e081-a588-000000001109 30575 1726867619.46896: done sending task result for task 0affcac9-a3a5-e081-a588-000000001109 30575 1726867619.46900: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30575 1726867619.46958: no more pending results, returning what we have 30575 1726867619.46962: results queue empty 30575 1726867619.46963: checking for any_errors_fatal 30575 1726867619.46973: done checking for any_errors_fatal 30575 1726867619.46974: checking for 
max_fail_percentage 30575 1726867619.46976: done checking for max_fail_percentage 30575 1726867619.46980: checking to see if all hosts have failed and the running result is not ok 30575 1726867619.46981: done checking to see if all hosts have failed 30575 1726867619.46981: getting the remaining hosts for this loop 30575 1726867619.46987: done getting the remaining hosts for this loop 30575 1726867619.46991: getting the next task for host managed_node3 30575 1726867619.47076: done getting next task for host managed_node3 30575 1726867619.47105: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867619.47168: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867619.47200: getting variables 30575 1726867619.47203: in VariableManager get_vars() 30575 1726867619.47410: Calling all_inventory to load vars for managed_node3 30575 1726867619.47413: Calling groups_inventory to load vars for managed_node3 30575 1726867619.47415: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867619.47426: Calling all_plugins_play to load vars for managed_node3 30575 1726867619.47430: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867619.47433: Calling groups_plugins_play to load vars for managed_node3 30575 1726867619.49765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867619.52531: done with get_vars() 30575 1726867619.52555: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:26:59 -0400 (0:00:00.098) 0:00:54.904 ****** 30575 1726867619.52676: entering _queue_task() for managed_node3/ping 30575 1726867619.53221: worker is 1 (out of 1 available) 30575 1726867619.53238: exiting _queue_task() for managed_node3/ping 30575 1726867619.53258: done queuing things up, now waiting for results queue to drain 30575 1726867619.53261: waiting for pending results... 
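The "Re-test connectivity" task below runs the `ping` module over the SSH connection; the log's stdout chunk shows it returning `{"ping": "pong", ...}`. A rough sketch of that module's contract (assumed simplification, not the actual `ansible.modules.ping` source):

```python
# Sketch of the ping module's round-trip contract, matching the
# {"ping": "pong"} stdout captured in the log below.
def ping(data="pong"):
    """Echo back the supplied data, defaulting to 'pong'."""
    if data == "crash":
        # the real module treats this value as a forced failure
        raise Exception("boom")
    return {"ping": data}

print(ping())  # {'ping': 'pong'}
```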
30575 1726867619.53694: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867619.53815: in run() - task 0affcac9-a3a5-e081-a588-00000000110a 30575 1726867619.53820: variable 'ansible_search_path' from source: unknown 30575 1726867619.53826: variable 'ansible_search_path' from source: unknown 30575 1726867619.53867: calling self._execute() 30575 1726867619.54004: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867619.54008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867619.54011: variable 'omit' from source: magic vars 30575 1726867619.54623: variable 'ansible_distribution_major_version' from source: facts 30575 1726867619.54641: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867619.54656: variable 'omit' from source: magic vars 30575 1726867619.54770: variable 'omit' from source: magic vars 30575 1726867619.54826: variable 'omit' from source: magic vars 30575 1726867619.55095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867619.55098: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867619.55101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867619.55103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867619.55105: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867619.55107: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867619.55108: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867619.55110: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867619.55255: Set connection var ansible_pipelining to False 30575 1726867619.55265: Set connection var ansible_shell_type to sh 30575 1726867619.55275: Set connection var ansible_shell_executable to /bin/sh 30575 1726867619.55288: Set connection var ansible_timeout to 10 30575 1726867619.55298: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867619.55309: Set connection var ansible_connection to ssh 30575 1726867619.55349: variable 'ansible_shell_executable' from source: unknown 30575 1726867619.55356: variable 'ansible_connection' from source: unknown 30575 1726867619.55363: variable 'ansible_module_compression' from source: unknown 30575 1726867619.55368: variable 'ansible_shell_type' from source: unknown 30575 1726867619.55374: variable 'ansible_shell_executable' from source: unknown 30575 1726867619.55419: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867619.55422: variable 'ansible_pipelining' from source: unknown 30575 1726867619.55424: variable 'ansible_timeout' from source: unknown 30575 1726867619.55426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867619.55797: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867619.55802: variable 'omit' from source: magic vars 30575 1726867619.55806: starting attempt loop 30575 1726867619.55814: running the handler 30575 1726867619.55855: _low_level_execute_command(): starting 30575 1726867619.55872: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867619.57134: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867619.57140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867619.57143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867619.57232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867619.58907: stdout chunk (state=3): >>>/root <<< 30575 1726867619.59027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867619.59065: stderr chunk (state=3): >>><<< 30575 1726867619.59069: stdout chunk (state=3): >>><<< 30575 1726867619.59082: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867619.59095: _low_level_execute_command(): starting 30575 1726867619.59100: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867619.590826-33225-106594085890034 `" && echo ansible-tmp-1726867619.590826-33225-106594085890034="` echo /root/.ansible/tmp/ansible-tmp-1726867619.590826-33225-106594085890034 `" ) && sleep 0' 30575 1726867619.59621: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867619.59624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867619.59627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867619.59636: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867619.59685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867619.59689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867619.59740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867619.61654: stdout chunk (state=3): >>>ansible-tmp-1726867619.590826-33225-106594085890034=/root/.ansible/tmp/ansible-tmp-1726867619.590826-33225-106594085890034 <<< 30575 1726867619.61760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867619.61782: stderr chunk (state=3): >>><<< 30575 1726867619.61785: stdout chunk (state=3): >>><<< 30575 1726867619.61800: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867619.590826-33225-106594085890034=/root/.ansible/tmp/ansible-tmp-1726867619.590826-33225-106594085890034 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867619.61837: variable 'ansible_module_compression' from source: unknown 30575 1726867619.61869: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30575 1726867619.61895: variable 'ansible_facts' from source: unknown 30575 1726867619.61949: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867619.590826-33225-106594085890034/AnsiballZ_ping.py 30575 1726867619.62036: Sending initial data 30575 1726867619.62039: Sent initial data (152 bytes) 30575 1726867619.62589: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867619.62592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867619.62595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867619.62597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867619.62636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867619.62646: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867619.62702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867619.64268: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867619.64321: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867619.64393: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp7op_xexf /root/.ansible/tmp/ansible-tmp-1726867619.590826-33225-106594085890034/AnsiballZ_ping.py <<< 30575 1726867619.64400: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867619.590826-33225-106594085890034/AnsiballZ_ping.py" <<< 30575 1726867619.64465: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp7op_xexf" to remote "/root/.ansible/tmp/ansible-tmp-1726867619.590826-33225-106594085890034/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867619.590826-33225-106594085890034/AnsiballZ_ping.py" <<< 30575 1726867619.65415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867619.65456: stderr chunk (state=3): >>><<< 30575 1726867619.65463: stdout chunk (state=3): >>><<< 30575 1726867619.65565: done transferring module to remote 30575 1726867619.65569: _low_level_execute_command(): starting 30575 1726867619.65571: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867619.590826-33225-106594085890034/ /root/.ansible/tmp/ansible-tmp-1726867619.590826-33225-106594085890034/AnsiballZ_ping.py && sleep 0' 30575 1726867619.66165: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867619.66182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867619.66201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867619.66219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867619.66235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867619.66245: 
stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867619.66319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867619.66353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867619.66372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867619.66394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867619.66455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867619.68193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867619.68216: stderr chunk (state=3): >>><<< 30575 1726867619.68222: stdout chunk (state=3): >>><<< 30575 1726867619.68236: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867619.68239: _low_level_execute_command(): starting 30575 1726867619.68242: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867619.590826-33225-106594085890034/AnsiballZ_ping.py && sleep 0' 30575 1726867619.68650: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867619.68653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867619.68656: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867619.68658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867619.68659: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867619.68708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867619.68718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867619.68760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867619.83810: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30575 1726867619.85036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867619.85040: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. <<< 30575 1726867619.85188: stderr chunk (state=3): >>><<< 30575 1726867619.85295: stdout chunk (state=3): >>><<< 30575 1726867619.85317: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed.
30575 1726867619.85345: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867619.590826-33225-106594085890034/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
30575 1726867619.85349: _low_level_execute_command(): starting
30575 1726867619.85354: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867619.590826-33225-106594085890034/ > /dev/null 2>&1 && sleep 0'
30575 1726867619.86695: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
30575 1726867619.86795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30575 1726867619.86874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30575 1726867619.88727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30575 1726867619.88731: stdout chunk (state=3): >>><<<
30575 1726867619.88738: stderr chunk (state=3): >>><<<
30575 1726867619.88754: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30575 1726867619.88762: handler run complete
30575 1726867619.88788: attempt loop complete, returning result
30575 1726867619.88791: _execute() done
30575 1726867619.88896: dumping result to json
30575 1726867619.88899: done dumping result, returning
30575 1726867619.88902: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-e081-a588-00000000110a]
30575 1726867619.88908: sending task result for task 0affcac9-a3a5-e081-a588-00000000110a
ok: [managed_node3] => {
    "changed": false,
    "ping": "pong"
}
30575 1726867619.89186: no more pending results, returning what we have
30575 1726867619.89190: results queue empty
30575 1726867619.89190: checking for any_errors_fatal
30575 1726867619.89198: done checking for any_errors_fatal
30575 1726867619.89199: checking for max_fail_percentage
30575 1726867619.89200: done checking for max_fail_percentage
30575 1726867619.89201: checking to see if all hosts have failed and the running result is not ok
30575 1726867619.89202: done checking to see if all hosts have failed
30575 1726867619.89203: getting the remaining hosts for this loop
30575 1726867619.89206: done getting the remaining hosts for this loop
30575 1726867619.89209: getting the next task for host managed_node3
30575 1726867619.89223: done getting next task for host managed_node3
30575 1726867619.89225: ^ task is: TASK: meta (role_complete)
30575 1726867619.89231: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867619.89243: getting variables
30575 1726867619.89245: in VariableManager get_vars()
30575 1726867619.89397: Calling all_inventory to load vars for managed_node3
30575 1726867619.89400: Calling groups_inventory to load vars for managed_node3
30575 1726867619.89403: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867619.89410: done sending task result for task 0affcac9-a3a5-e081-a588-00000000110a
30575 1726867619.89413: WORKER PROCESS EXITING
30575 1726867619.89512: Calling all_plugins_play to load vars for managed_node3
30575 1726867619.89519: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867619.89523: Calling groups_plugins_play to load vars for managed_node3
30575 1726867619.93528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867619.95352: done with get_vars()
30575 1726867619.95381: done getting variables
30575 1726867619.95466: done queuing things up, now waiting for results queue to drain
30575 1726867619.95469: results queue empty
30575 1726867619.95470: checking for any_errors_fatal
30575 1726867619.95473: done checking for any_errors_fatal
30575 1726867619.95474: checking for max_fail_percentage
30575 1726867619.95475: done checking for max_fail_percentage
30575 1726867619.95475: checking to see if all hosts have failed and the running result is not ok
30575 1726867619.95476: done checking to see if all hosts have failed
30575 1726867619.95617: getting the remaining hosts for this loop
30575 1726867619.95619: done getting the remaining hosts for this loop
30575 1726867619.95623: getting the next task for host managed_node3
30575 1726867619.95628: done getting next task for host managed_node3
30575 1726867619.95630: ^ task is: TASK: Show result
30575 1726867619.95633: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867619.95635: getting variables
30575 1726867619.95636: in VariableManager get_vars()
30575 1726867619.95651: Calling all_inventory to load vars for managed_node3
30575 1726867619.95653: Calling groups_inventory to load vars for managed_node3
30575 1726867619.95656: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867619.95663: Calling all_plugins_play to load vars for managed_node3
30575 1726867619.95666: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867619.95669: Calling groups_plugins_play to load vars for managed_node3
30575 1726867619.98231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867620.01892: done with get_vars()
30575 1726867620.01929: done getting variables
30575 1726867620.01969: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Show result] *************************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14
Friday 20 September 2024 17:27:00 -0400 (0:00:00.493) 0:00:55.397 ******
30575 1726867620.02000: entering _queue_task() for managed_node3/debug
30575 1726867620.02566: worker is 1 (out of 1 available)
30575 1726867620.02582: exiting _queue_task() for managed_node3/debug
30575 1726867620.02595: done queuing things up, now waiting for results queue to drain
30575 1726867620.02596: waiting for pending results...
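For orientation while reading this trace: the "Show result" task being queued here is a plain `debug` task. Its body is not reproduced in the log; from the task name, its path (`create_bridge_profile.yml:14`), and the `__network_connections_result` fact it resolves below, a plausible reconstruction is:

```yaml
# Hypothetical reconstruction -- the log records only the task name, its
# path, and that the 'debug' action printed '__network_connections_result'.
- name: Show result
  ansible.builtin.debug:
    var: __network_connections_result
```

The `ok: [managed_node3]` output that follows is consistent with this kind of task: an unconditional dump of a fact registered earlier by the role.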
30575 1726867620.02947: running TaskExecutor() for managed_node3/TASK: Show result
30575 1726867620.03115: in run() - task 0affcac9-a3a5-e081-a588-000000001090
30575 1726867620.03121: variable 'ansible_search_path' from source: unknown
30575 1726867620.03124: variable 'ansible_search_path' from source: unknown
30575 1726867620.03152: calling self._execute()
30575 1726867620.03286: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867620.03310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867620.03382: variable 'omit' from source: magic vars
30575 1726867620.03943: variable 'ansible_distribution_major_version' from source: facts
30575 1726867620.03962: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867620.03973: variable 'omit' from source: magic vars
30575 1726867620.04041: variable 'omit' from source: magic vars
30575 1726867620.04081: variable 'omit' from source: magic vars
30575 1726867620.04140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30575 1726867620.04191: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30575 1726867620.04287: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30575 1726867620.04291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867620.04293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867620.04327: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30575 1726867620.04336: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867620.04347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867620.04475: Set connection var ansible_pipelining to False
30575 1726867620.04486: Set connection var ansible_shell_type to sh
30575 1726867620.04496: Set connection var ansible_shell_executable to /bin/sh
30575 1726867620.04505: Set connection var ansible_timeout to 10
30575 1726867620.04514: Set connection var ansible_module_compression to ZIP_DEFLATED
30575 1726867620.04527: Set connection var ansible_connection to ssh
30575 1726867620.04650: variable 'ansible_shell_executable' from source: unknown
30575 1726867620.04653: variable 'ansible_connection' from source: unknown
30575 1726867620.04656: variable 'ansible_module_compression' from source: unknown
30575 1726867620.04658: variable 'ansible_shell_type' from source: unknown
30575 1726867620.04660: variable 'ansible_shell_executable' from source: unknown
30575 1726867620.04662: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867620.04664: variable 'ansible_pipelining' from source: unknown
30575 1726867620.04666: variable 'ansible_timeout' from source: unknown
30575 1726867620.04668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867620.04758: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30575 1726867620.04776: variable 'omit' from source: magic vars
30575 1726867620.04812: starting attempt loop
30575 1726867620.04818: running the handler
30575 1726867620.04892: variable '__network_connections_result' from source: set_fact
30575 1726867620.05593: variable '__network_connections_result' from source: set_fact
30575 1726867620.05596: handler run complete
30575 1726867620.05598: attempt loop complete, returning result
30575 1726867620.05600: _execute() done
30575 1726867620.05602: dumping result to json
30575 1726867620.05608: done dumping result, returning
30575 1726867620.05611: done running TaskExecutor() for managed_node3/TASK: Show result [0affcac9-a3a5-e081-a588-000000001090]
30575 1726867620.05613: sending task result for task 0affcac9-a3a5-e081-a588-000000001090
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "ip": {
                            "auto6": false,
                            "dhcp4": false
                        },
                        "name": "statebr",
                        "persistent_state": "present",
                        "type": "bridge"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 12e4c575-fa21-4cd0-afc7-2cb6b45b6219\n",
        "stderr_lines": [
            "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 12e4c575-fa21-4cd0-afc7-2cb6b45b6219"
        ]
    }
}
30575 1726867620.06004: no more pending results, returning what we have
30575 1726867620.06008: results queue empty
30575 1726867620.06009: checking for any_errors_fatal
30575 1726867620.06012: done checking for any_errors_fatal
30575 1726867620.06012: checking for max_fail_percentage
30575 1726867620.06016: done checking for max_fail_percentage
30575 1726867620.06017: checking to see if all hosts have failed and the running result is not ok
30575 1726867620.06018: done checking to see if all hosts have failed
30575 1726867620.06019: getting the remaining hosts for this loop
30575 1726867620.06021: done getting the remaining hosts for this loop
30575 1726867620.06027: getting the next task for host managed_node3
30575 1726867620.06040: done getting next task for host managed_node3
30575 1726867620.06044: ^ task is: TASK: Include network role
30575 1726867620.06051: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867620.06055: getting variables
30575 1726867620.06057: in VariableManager get_vars()
30575 1726867620.06409: Calling all_inventory to load vars for managed_node3
30575 1726867620.06412: Calling groups_inventory to load vars for managed_node3
30575 1726867620.06419: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867620.06434: Calling all_plugins_play to load vars for managed_node3
30575 1726867620.06439: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867620.06443: Calling groups_plugins_play to load vars for managed_node3
30575 1726867620.07183: done sending task result for task 0affcac9-a3a5-e081-a588-000000001090
30575 1726867620.07191: WORKER PROCESS EXITING
30575 1726867620.19134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867620.22797: done with get_vars()
30575 1726867620.22832: done getting variables

TASK [Include network role] ****************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3
Friday 20 September 2024 17:27:00 -0400 (0:00:00.210) 0:00:55.607 ******
30575 1726867620.23016: entering _queue_task() for managed_node3/include_role
30575 1726867620.23843: worker is 1 (out of 1 available)
30575 1726867620.23857: exiting _queue_task() for managed_node3/include_role
30575 1726867620.23871: done queuing things up, now waiting for results queue to drain
30575 1726867620.23873: waiting for pending results...
30575 1726867620.24714: running TaskExecutor() for managed_node3/TASK: Include network role
30575 1726867620.25486: in run() - task 0affcac9-a3a5-e081-a588-000000001094
30575 1726867620.25492: variable 'ansible_search_path' from source: unknown
30575 1726867620.25496: variable 'ansible_search_path' from source: unknown
30575 1726867620.25622: calling self._execute()
30575 1726867620.26055: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867620.26060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867620.26063: variable 'omit' from source: magic vars
30575 1726867620.27082: variable 'ansible_distribution_major_version' from source: facts
30575 1726867620.27441: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867620.27446: _execute() done
30575 1726867620.27449: dumping result to json
30575 1726867620.27452: done dumping result, returning
30575 1726867620.27456: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcac9-a3a5-e081-a588-000000001094]
30575 1726867620.27459: sending task result for task 0affcac9-a3a5-e081-a588-000000001094
30575 1726867620.27695: no more pending results, returning what we have
30575 1726867620.27702: in VariableManager get_vars()
30575 1726867620.27749: Calling all_inventory to load vars for managed_node3
30575 1726867620.27752: Calling groups_inventory to load vars for managed_node3
30575 1726867620.27756: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867620.27775: Calling all_plugins_play to load vars for managed_node3
30575 1726867620.27782: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867620.27786: Calling groups_plugins_play to load vars for managed_node3
30575 1726867620.28940: done sending task result for task 0affcac9-a3a5-e081-a588-000000001094
30575 1726867620.28943: WORKER PROCESS EXITING
30575 1726867620.31778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867620.35663: done with get_vars()
30575 1726867620.36096: variable 'ansible_search_path' from source: unknown
30575 1726867620.36097: variable 'ansible_search_path' from source: unknown
30575 1726867620.36257: variable 'omit' from source: magic vars
30575 1726867620.36719: variable 'omit' from source: magic vars
30575 1726867620.36738: variable 'omit' from source: magic vars
30575 1726867620.36742: we have included files to process
30575 1726867620.36743: generating all_blocks data
30575 1726867620.36747: done generating all_blocks data
30575 1726867620.36752: processing included file: fedora.linux_system_roles.network
30575 1726867620.36776: in VariableManager get_vars()
30575 1726867620.36884: done with get_vars()
30575 1726867620.37035: in VariableManager get_vars()
30575 1726867620.37055: done with get_vars()
30575 1726867620.37254: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
30575 1726867620.37499: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
30575 1726867620.37914: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
30575 1726867620.39164: in VariableManager get_vars()
30575 1726867620.39284: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
30575 1726867620.44244: iterating over new_blocks loaded from include file
30575 1726867620.44247: in VariableManager get_vars()
30575 1726867620.44267: done with get_vars()
30575 1726867620.44269: filtering new block on tags
30575 1726867620.44570: done filtering new block on tags
30575 1726867620.44574: in VariableManager get_vars()
30575 1726867620.44593: done with get_vars()
30575 1726867620.44595: filtering new block on tags
30575 1726867620.44610: done filtering new block on tags
30575 1726867620.44612: done iterating over new_blocks loaded from include file
included: fedora.linux_system_roles.network for managed_node3
30575 1726867620.44622: extending task lists for all hosts with included blocks
30575 1726867620.45026: done extending task lists
30575 1726867620.45027: done processing included files
30575 1726867620.45028: results queue empty
30575 1726867620.45029: checking for any_errors_fatal
30575 1726867620.45035: done checking for any_errors_fatal
30575 1726867620.45036: checking for max_fail_percentage
30575 1726867620.45037: done checking for max_fail_percentage
30575 1726867620.45038: checking to see if all hosts have failed and the running result is not ok
30575 1726867620.45039: done checking to see if all hosts have failed
30575 1726867620.45040: getting the remaining hosts for this loop
30575 1726867620.45041: done getting the remaining hosts for this loop
30575 1726867620.45044: getting the next task for host managed_node3
30575 1726867620.45049: done getting next task for host managed_node3
30575 1726867620.45052: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
30575 1726867620.45055: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867620.45067: getting variables
30575 1726867620.45068: in VariableManager get_vars()
30575 1726867620.45084: Calling all_inventory to load vars for managed_node3
30575 1726867620.45087: Calling groups_inventory to load vars for managed_node3
30575 1726867620.45089: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867620.45322: Calling all_plugins_play to load vars for managed_node3
30575 1726867620.45326: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867620.45329: Calling groups_plugins_play to load vars for managed_node3
30575 1726867620.47307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867620.51050: done with get_vars()
30575 1726867620.51385: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 17:27:00 -0400 (0:00:00.284) 0:00:55.892 ******
30575 1726867620.51480: entering _queue_task() for managed_node3/include_tasks
30575 1726867620.52651: worker is 1 (out of 1 available)
30575 1726867620.52664: exiting _queue_task() for managed_node3/include_tasks
30575 1726867620.52742: done queuing things up, now waiting for results queue to drain
30575 1726867620.52745: waiting for pending results...
30575 1726867620.53134: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
30575 1726867620.53451: in run() - task 0affcac9-a3a5-e081-a588-00000000127a
30575 1726867620.53473: variable 'ansible_search_path' from source: unknown
30575 1726867620.53582: variable 'ansible_search_path' from source: unknown
30575 1726867620.53596: calling self._execute()
30575 1726867620.53813: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867620.53827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867620.53845: variable 'omit' from source: magic vars
30575 1726867620.54716: variable 'ansible_distribution_major_version' from source: facts
30575 1726867620.54742: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867620.54754: _execute() done
30575 1726867620.54763: dumping result to json
30575 1726867620.54771: done dumping result, returning
30575 1726867620.54846: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-e081-a588-00000000127a]
30575 1726867620.54850: sending task result for task 0affcac9-a3a5-e081-a588-00000000127a
30575 1726867620.54944: done sending task result for task 0affcac9-a3a5-e081-a588-00000000127a
30575 1726867620.55064: WORKER PROCESS EXITING
30575 1726867620.55122: no more pending results, returning what we have
30575 1726867620.55128: in VariableManager get_vars()
30575 1726867620.55172: Calling all_inventory to load vars for managed_node3
30575 1726867620.55175: Calling groups_inventory to load vars for managed_node3
30575 1726867620.55179: Calling all_plugins_inventory to load vars for managed_node3
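For context, the profile echoed back in the `__network_connections_result` output earlier in this trace (a bridge named `statebr` with DHCP4 and IPv6 autoconf disabled) matches what the role would receive from a `network_connections` variable along these lines. The surrounding task structure is a hypothetical sketch; only the connection fields are taken from the logged `module_args`:

```yaml
# Hypothetical invocation sketch; the connection fields are copied from
# the logged module_args ('statebr' bridge, dhcp4/auto6 disabled).
- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: statebr
        type: bridge
        persistent_state: present
        ip:
          dhcp4: false
          auto6: false
```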
30575 1726867620.55189: Calling all_plugins_play to load vars for managed_node3
30575 1726867620.55191: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867620.55193: Calling groups_plugins_play to load vars for managed_node3
30575 1726867620.58920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867620.62359: done with get_vars()
30575 1726867620.62394: variable 'ansible_search_path' from source: unknown
30575 1726867620.62396: variable 'ansible_search_path' from source: unknown
30575 1726867620.62559: we have included files to process
30575 1726867620.62561: generating all_blocks data
30575 1726867620.62563: done generating all_blocks data
30575 1726867620.62565: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30575 1726867620.62566: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30575 1726867620.62569: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30575 1726867620.63997: done processing included file
30575 1726867620.64000: iterating over new_blocks loaded from include file
30575 1726867620.64001: in VariableManager get_vars()
30575 1726867620.64033: done with get_vars()
30575 1726867620.64035: filtering new block on tags
30575 1726867620.64293: done filtering new block on tags
30575 1726867620.64296: in VariableManager get_vars()
30575 1726867620.64325: done with get_vars()
30575 1726867620.64328: filtering new block on tags
30575 1726867620.64382: done filtering new block on tags
30575 1726867620.64385: in VariableManager get_vars()
30575 1726867620.64405: done with get_vars()
30575 1726867620.64407: filtering new block on tags
30575 1726867620.64444: done filtering new block on tags
30575 1726867620.64446: done iterating over new_blocks loaded from include file
included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3
30575 1726867620.64451: extending task lists for all hosts with included blocks
30575 1726867620.68356: done extending task lists
30575 1726867620.68358: done processing included files
30575 1726867620.68359: results queue empty
30575 1726867620.68359: checking for any_errors_fatal
30575 1726867620.68362: done checking for any_errors_fatal
30575 1726867620.68363: checking for max_fail_percentage
30575 1726867620.68364: done checking for max_fail_percentage
30575 1726867620.68365: checking to see if all hosts have failed and the running result is not ok
30575 1726867620.68366: done checking to see if all hosts have failed
30575 1726867620.68367: getting the remaining hosts for this loop
30575 1726867620.68368: done getting the remaining hosts for this loop
30575 1726867620.68371: getting the next task for host managed_node3
30575 1726867620.68376: done getting next task for host managed_node3
30575 1726867620.68381: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
30575 1726867620.68385: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867620.68484: getting variables
30575 1726867620.68486: in VariableManager get_vars()
30575 1726867620.68508: Calling all_inventory to load vars for managed_node3
30575 1726867620.68511: Calling groups_inventory to load vars for managed_node3
30575 1726867620.68513: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867620.68521: Calling all_plugins_play to load vars for managed_node3
30575 1726867620.68525: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867620.68528: Calling groups_plugins_play to load vars for managed_node3
30575 1726867620.71133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867620.74565: done with get_vars()
30575 1726867620.74682: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 17:27:00 -0400 (0:00:00.233) 0:00:56.125 ******
30575 1726867620.74795: entering _queue_task() for managed_node3/setup
30575 1726867620.75207: worker is 1 (out of 1 available)
30575 1726867620.75218: exiting _queue_task() for managed_node3/setup
30575 1726867620.75232: done queuing things up, now waiting for results queue to drain
30575 1726867620.75233: waiting for pending results...
30575 1726867620.75514: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
30575 1726867620.75702: in run() - task 0affcac9-a3a5-e081-a588-0000000012d1
30575 1726867620.75728: variable 'ansible_search_path' from source: unknown
30575 1726867620.75736: variable 'ansible_search_path' from source: unknown
30575 1726867620.75802: calling self._execute()
30575 1726867620.75875: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867620.75889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867620.75910: variable 'omit' from source: magic vars
30575 1726867620.76345: variable 'ansible_distribution_major_version' from source: facts
30575 1726867620.76349: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867620.76579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30575 1726867620.79571: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30575 1726867620.79657: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30575 1726867620.79782: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30575 1726867620.79786: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30575 1726867620.79788: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30575 1726867620.79866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30575 1726867620.79905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30575 1726867620.79948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30575 1726867620.79997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30575 1726867620.80017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30575 1726867620.80085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30575 1726867620.80113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30575 1726867620.80152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30575 1726867620.80198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30575 1726867620.80216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867620.80387: variable '__network_required_facts' from source: role '' defaults 30575 1726867620.80467: variable 'ansible_facts' from source: unknown 30575 1726867620.81182: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30575 1726867620.81191: when evaluation is False, skipping this task 30575 1726867620.81199: _execute() done 30575 1726867620.81207: dumping result to json 30575 1726867620.81215: done dumping result, returning 30575 1726867620.81236: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-e081-a588-0000000012d1] 30575 1726867620.81248: sending task result for task 0affcac9-a3a5-e081-a588-0000000012d1 30575 1726867620.81467: done sending task result for task 0affcac9-a3a5-e081-a588-0000000012d1 30575 1726867620.81470: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867620.81518: no more pending results, returning what we have 30575 1726867620.81522: results queue empty 30575 1726867620.81523: checking for any_errors_fatal 30575 1726867620.81525: done checking for any_errors_fatal 30575 1726867620.81526: checking for max_fail_percentage 30575 1726867620.81528: done checking for max_fail_percentage 30575 1726867620.81529: checking to see if all hosts have failed and the running result is not ok 30575 1726867620.81530: done checking to see if all hosts have failed 30575 1726867620.81531: getting the remaining hosts for this loop 30575 1726867620.81532: done getting the remaining hosts for this loop 30575 1726867620.81537: getting the next task for host managed_node3 30575 1726867620.81557: done getting next task for host managed_node3 
30575 1726867620.81561: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867620.81567: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867620.81595: getting variables 30575 1726867620.81597: in VariableManager get_vars() 30575 1726867620.81638: Calling all_inventory to load vars for managed_node3 30575 1726867620.81641: Calling groups_inventory to load vars for managed_node3 30575 1726867620.81644: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867620.81772: Calling all_plugins_play to load vars for managed_node3 30575 1726867620.81779: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867620.81789: Calling groups_plugins_play to load vars for managed_node3 30575 1726867620.83425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867620.85881: done with get_vars() 30575 1726867620.85905: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:27:00 -0400 (0:00:00.111) 0:00:56.237 ****** 30575 1726867620.85991: entering _queue_task() for managed_node3/stat 30575 1726867620.86264: worker is 1 (out of 1 available) 30575 1726867620.86281: exiting _queue_task() for managed_node3/stat 30575 1726867620.86296: done queuing things up, now waiting for results queue to drain 30575 1726867620.86298: waiting for pending results... 
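The "Ensure ansible_facts used by role are present" task above was skipped because its `when` condition, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, evaluated to False: every fact the role needs was already cached. A minimal Python sketch of that set-difference gate (the fact names below are illustrative, not the role's actual `__network_required_facts` list):

```python
# Hypothetical required-fact names; the real list lives in the role defaults
# as __network_required_facts.
required_facts = {"distribution", "distribution_major_version", "os_family"}

# Facts already gathered for managed_node3 (sample values for illustration).
ansible_facts = {
    "distribution": "CentOS",
    "distribution_major_version": "10",
    "os_family": "RedHat",
}

# Python equivalent of the Jinja2 condition:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
missing = required_facts.difference(ansible_facts.keys())
needs_setup = len(missing) > 0

print(needs_setup)  # False -> the setup task is skipped, as in the log
```

When any required fact is absent, `missing` is non-empty and the role runs a scoped `setup` (fact gathering) pass instead of skipping.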
30575 1726867620.86490: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867620.86591: in run() - task 0affcac9-a3a5-e081-a588-0000000012d3 30575 1726867620.86602: variable 'ansible_search_path' from source: unknown 30575 1726867620.86605: variable 'ansible_search_path' from source: unknown 30575 1726867620.86644: calling self._execute() 30575 1726867620.86713: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867620.86721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867620.86726: variable 'omit' from source: magic vars 30575 1726867620.87017: variable 'ansible_distribution_major_version' from source: facts 30575 1726867620.87032: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867620.87240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867620.87506: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867620.87548: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867620.87580: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867620.87614: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867620.87738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867620.87762: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867620.87788: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867620.87812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867620.87976: variable '__network_is_ostree' from source: set_fact 30575 1726867620.87984: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867620.88031: when evaluation is False, skipping this task 30575 1726867620.88035: _execute() done 30575 1726867620.88050: dumping result to json 30575 1726867620.88057: done dumping result, returning 30575 1726867620.88061: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-e081-a588-0000000012d3] 30575 1726867620.88064: sending task result for task 0affcac9-a3a5-e081-a588-0000000012d3 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867620.88531: no more pending results, returning what we have 30575 1726867620.88536: results queue empty 30575 1726867620.88536: checking for any_errors_fatal 30575 1726867620.88544: done checking for any_errors_fatal 30575 1726867620.88545: checking for max_fail_percentage 30575 1726867620.88546: done checking for max_fail_percentage 30575 1726867620.88547: checking to see if all hosts have failed and the running result is not ok 30575 1726867620.88548: done checking to see if all hosts have failed 30575 1726867620.88549: getting the remaining hosts for this loop 30575 1726867620.88551: done getting the remaining hosts for this loop 30575 1726867620.88555: getting the next task for host managed_node3 30575 1726867620.88564: done getting next task for host managed_node3 30575 
1726867620.88567: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867620.88573: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867620.88607: getting variables 30575 1726867620.88609: in VariableManager get_vars() 30575 1726867620.88651: Calling all_inventory to load vars for managed_node3 30575 1726867620.88653: Calling groups_inventory to load vars for managed_node3 30575 1726867620.88655: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867620.88665: Calling all_plugins_play to load vars for managed_node3 30575 1726867620.88667: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867620.88670: Calling groups_plugins_play to load vars for managed_node3 30575 1726867620.89190: done sending task result for task 0affcac9-a3a5-e081-a588-0000000012d3 30575 1726867620.89194: WORKER PROCESS EXITING 30575 1726867620.90870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867620.91866: done with get_vars() 30575 1726867620.91891: done getting variables 30575 1726867620.91942: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:27:00 -0400 (0:00:00.059) 0:00:56.297 ****** 30575 1726867620.91972: entering _queue_task() for managed_node3/set_fact 30575 1726867620.92258: worker is 1 (out of 1 available) 30575 1726867620.92272: exiting _queue_task() for managed_node3/set_fact 30575 1726867620.92290: done queuing things up, now waiting for results queue to drain 30575 1726867620.92291: waiting for pending results... 
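Both ostree tasks ("Check if system is ostree" and "Set flag to indicate system is ostree") skip here because `__network_is_ostree` was already populated by an earlier pass through set_facts.yml, so the condition `not __network_is_ostree is defined` is False. A sketch of that idempotence pattern, assuming the commonly used `/run/ostree-booted` marker file is what the underlying `stat` task checks (the log censors the details, so treat the path as an assumption):

```python
import os

def detect_ostree(marker="/run/ostree-booted"):
    """Approximation of the role's check: rpm-ostree based systems
    (e.g. Fedora CoreOS) expose this runtime marker. Path is assumed."""
    return os.path.exists(marker)

# Facts cached from a previous run of set_facts.yml via set_fact.
cached_facts = {"__network_is_ostree": False}

# Python equivalent of the Jinja2 condition
#   not __network_is_ostree is defined
run_check = "__network_is_ostree" not in cached_facts

print(run_check)  # False -> both tasks skip, matching the log
```

Guarding the `stat` call this way means the (comparatively expensive) remote check runs at most once per play, no matter how many times the role is included.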
30575 1726867620.92485: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867620.92591: in run() - task 0affcac9-a3a5-e081-a588-0000000012d4 30575 1726867620.92604: variable 'ansible_search_path' from source: unknown 30575 1726867620.92608: variable 'ansible_search_path' from source: unknown 30575 1726867620.92642: calling self._execute() 30575 1726867620.92718: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867620.92722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867620.92728: variable 'omit' from source: magic vars 30575 1726867620.93331: variable 'ansible_distribution_major_version' from source: facts 30575 1726867620.93335: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867620.93338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867620.94035: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867620.94126: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867620.94169: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867620.94210: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867620.94351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867620.94368: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867620.94405: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867620.94460: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867620.94552: variable '__network_is_ostree' from source: set_fact 30575 1726867620.94559: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867620.94565: when evaluation is False, skipping this task 30575 1726867620.94610: _execute() done 30575 1726867620.94613: dumping result to json 30575 1726867620.94616: done dumping result, returning 30575 1726867620.94619: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-e081-a588-0000000012d4] 30575 1726867620.94621: sending task result for task 0affcac9-a3a5-e081-a588-0000000012d4 30575 1726867620.94694: done sending task result for task 0affcac9-a3a5-e081-a588-0000000012d4 30575 1726867620.94697: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867620.94741: no more pending results, returning what we have 30575 1726867620.94745: results queue empty 30575 1726867620.94745: checking for any_errors_fatal 30575 1726867620.94753: done checking for any_errors_fatal 30575 1726867620.94753: checking for max_fail_percentage 30575 1726867620.94755: done checking for max_fail_percentage 30575 1726867620.94756: checking to see if all hosts have failed and the running result is not ok 30575 1726867620.94757: done checking to see if all hosts have failed 30575 1726867620.94757: getting the remaining hosts for this loop 30575 1726867620.94759: done getting the remaining hosts for this loop 
30575 1726867620.94762: getting the next task for host managed_node3 30575 1726867620.94773: done getting next task for host managed_node3 30575 1726867620.94778: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867620.94784: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867620.94809: getting variables 30575 1726867620.94810: in VariableManager get_vars() 30575 1726867620.94849: Calling all_inventory to load vars for managed_node3 30575 1726867620.94851: Calling groups_inventory to load vars for managed_node3 30575 1726867620.94853: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867620.94864: Calling all_plugins_play to load vars for managed_node3 30575 1726867620.94866: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867620.94869: Calling groups_plugins_play to load vars for managed_node3 30575 1726867620.95692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867620.96574: done with get_vars() 30575 1726867620.96591: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:27:00 -0400 (0:00:00.046) 0:00:56.344 ****** 30575 1726867620.96657: entering _queue_task() for managed_node3/service_facts 30575 1726867620.96872: worker is 1 (out of 1 available) 30575 1726867620.96889: exiting _queue_task() for managed_node3/service_facts 30575 1726867620.96903: done queuing things up, now waiting for results queue to drain 30575 1726867620.96905: waiting for pending results... 
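The "Check which services are running" task queued above uses the `ansible.builtin.service_facts` module, which populates `ansible_facts.services` as a dict keyed by unit name, each entry carrying fields such as `state` and `status`. The role can then gate its behavior on which network services are active. A hedged sketch of consuming that structure (the sample data is illustrative, not taken from this run):

```python
# Illustrative subset of what ansible_facts.services might contain after
# service_facts runs on managed_node3.
services = {
    "NetworkManager.service": {"state": "running", "status": "enabled"},
    "wpa_supplicant.service": {"state": "inactive", "status": "disabled"},
    "firewalld.service": {"state": "running", "status": "enabled"},
}

# Collect the running units, e.g. to decide whether NetworkManager or
# initscripts should be used as the network provider.
running = sorted(
    name for name, svc in services.items() if svc["state"] == "running"
)

print(running)  # ['NetworkManager.service', 'firewalld.service']
```

In the role itself this filtering is expressed in Jinja2 over `ansible_facts.services`; the Python above only mirrors the shape of the data.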
30575 1726867620.97084: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867620.97169: in run() - task 0affcac9-a3a5-e081-a588-0000000012d6 30575 1726867620.97183: variable 'ansible_search_path' from source: unknown 30575 1726867620.97187: variable 'ansible_search_path' from source: unknown 30575 1726867620.97214: calling self._execute() 30575 1726867620.97288: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867620.97292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867620.97302: variable 'omit' from source: magic vars 30575 1726867620.97581: variable 'ansible_distribution_major_version' from source: facts 30575 1726867620.97591: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867620.97596: variable 'omit' from source: magic vars 30575 1726867620.97647: variable 'omit' from source: magic vars 30575 1726867620.97669: variable 'omit' from source: magic vars 30575 1726867620.97703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867620.97732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867620.97748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867620.97761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867620.97771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867620.97800: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867620.97804: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867620.97806: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867620.97874: Set connection var ansible_pipelining to False 30575 1726867620.97879: Set connection var ansible_shell_type to sh 30575 1726867620.97883: Set connection var ansible_shell_executable to /bin/sh 30575 1726867620.97892: Set connection var ansible_timeout to 10 30575 1726867620.97895: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867620.97905: Set connection var ansible_connection to ssh 30575 1726867620.97923: variable 'ansible_shell_executable' from source: unknown 30575 1726867620.97926: variable 'ansible_connection' from source: unknown 30575 1726867620.97929: variable 'ansible_module_compression' from source: unknown 30575 1726867620.97931: variable 'ansible_shell_type' from source: unknown 30575 1726867620.97933: variable 'ansible_shell_executable' from source: unknown 30575 1726867620.97936: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867620.97938: variable 'ansible_pipelining' from source: unknown 30575 1726867620.97940: variable 'ansible_timeout' from source: unknown 30575 1726867620.97944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867620.98084: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867620.98093: variable 'omit' from source: magic vars 30575 1726867620.98098: starting attempt loop 30575 1726867620.98101: running the handler 30575 1726867620.98114: _low_level_execute_command(): starting 30575 1726867620.98124: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867620.98626: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30575 1726867620.98630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867620.98633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867620.98635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867620.98673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867620.98676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867620.98692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867620.98747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867621.00466: stdout chunk (state=3): >>>/root <<< 30575 1726867621.00569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867621.00597: stderr chunk (state=3): >>><<< 30575 1726867621.00600: stdout chunk (state=3): >>><<< 30575 1726867621.00621: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867621.00632: _low_level_execute_command(): starting 30575 1726867621.00637: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867621.0061939-33295-138455682369673 `" && echo ansible-tmp-1726867621.0061939-33295-138455682369673="` echo /root/.ansible/tmp/ansible-tmp-1726867621.0061939-33295-138455682369673 `" ) && sleep 0' 30575 1726867621.01069: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867621.01072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867621.01075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867621.01088: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867621.01090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867621.01142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867621.01144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867621.01187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867621.03095: stdout chunk (state=3): >>>ansible-tmp-1726867621.0061939-33295-138455682369673=/root/.ansible/tmp/ansible-tmp-1726867621.0061939-33295-138455682369673 <<< 30575 1726867621.03202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867621.03224: stderr chunk (state=3): >>><<< 30575 1726867621.03227: stdout chunk (state=3): >>><<< 30575 1726867621.03243: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867621.0061939-33295-138455682369673=/root/.ansible/tmp/ansible-tmp-1726867621.0061939-33295-138455682369673 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867621.03280: variable 'ansible_module_compression' from source: unknown 30575 1726867621.03314: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30575 1726867621.03352: variable 'ansible_facts' from source: unknown 30575 1726867621.03404: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867621.0061939-33295-138455682369673/AnsiballZ_service_facts.py 30575 1726867621.03501: Sending initial data 30575 1726867621.03504: Sent initial data (162 bytes) 30575 1726867621.03935: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867621.03938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867621.03940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867621.03944: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 
1726867621.03946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867621.03948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867621.03997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867621.04005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867621.04045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867621.05590: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867621.05593: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867621.05633: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867621.05679: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmplpib4ui7 /root/.ansible/tmp/ansible-tmp-1726867621.0061939-33295-138455682369673/AnsiballZ_service_facts.py <<< 30575 1726867621.05685: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867621.0061939-33295-138455682369673/AnsiballZ_service_facts.py" <<< 30575 1726867621.05729: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmplpib4ui7" to remote "/root/.ansible/tmp/ansible-tmp-1726867621.0061939-33295-138455682369673/AnsiballZ_service_facts.py" <<< 30575 1726867621.05731: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867621.0061939-33295-138455682369673/AnsiballZ_service_facts.py" <<< 30575 1726867621.06287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867621.06318: stderr chunk (state=3): >>><<< 30575 1726867621.06324: stdout chunk (state=3): >>><<< 30575 1726867621.06347: done transferring module to remote 30575 1726867621.06356: _low_level_execute_command(): starting 30575 1726867621.06359: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867621.0061939-33295-138455682369673/ /root/.ansible/tmp/ansible-tmp-1726867621.0061939-33295-138455682369673/AnsiballZ_service_facts.py && sleep 0' 30575 1726867621.06753: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867621.06788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867621.06791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867621.06793: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867621.06800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867621.06845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867621.06848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867621.06901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867621.08648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867621.08665: stderr chunk (state=3): >>><<< 30575 1726867621.08668: stdout chunk (state=3): >>><<< 30575 1726867621.08681: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867621.08684: _low_level_execute_command(): starting 30575 1726867621.08688: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867621.0061939-33295-138455682369673/AnsiballZ_service_facts.py && sleep 0' 30575 1726867621.09118: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867621.09121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867621.09123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867621.09127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867621.09129: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 
1726867621.09168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867621.09171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867621.09229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867622.61641: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": 
{"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": 
"rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": 
"systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30575 1726867622.63187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867622.63262: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. <<< 30575 1726867622.63266: stdout chunk (state=3): >>><<< 30575 1726867622.63268: stderr chunk (state=3): >>><<< 30575 1726867622.63484: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, 
"hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": 
{"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": 
"autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867622.64383: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867621.0061939-33295-138455682369673/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867622.64398: _low_level_execute_command(): starting 30575 1726867622.64408: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867621.0061939-33295-138455682369673/ > /dev/null 2>&1 && sleep 0' 30575 1726867622.65018: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867622.65031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867622.65044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867622.65062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867622.65081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867622.65092: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867622.65105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867622.65125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867622.65136: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address 
<<< 30575 1726867622.65146: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867622.65156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867622.65173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867622.65191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867622.65202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867622.65283: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867622.65306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867622.65386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867622.67236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867622.67293: stderr chunk (state=3): >>><<< 30575 1726867622.67302: stdout chunk (state=3): >>><<< 30575 1726867622.67320: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867622.67331: handler run complete 30575 1726867622.67529: variable 'ansible_facts' from source: unknown 30575 1726867622.67695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867622.68229: variable 'ansible_facts' from source: unknown 30575 1726867622.68380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867622.68602: attempt loop complete, returning result 30575 1726867622.68612: _execute() done 30575 1726867622.68618: dumping result to json 30575 1726867622.68686: done dumping result, returning 30575 1726867622.68710: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-e081-a588-0000000012d6] 30575 1726867622.68720: sending task result for task 0affcac9-a3a5-e081-a588-0000000012d6 30575 1726867622.70668: done sending task result for task 0affcac9-a3a5-e081-a588-0000000012d6 30575 1726867622.70679: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867622.70792: no more pending results, returning what we have 30575 1726867622.70795: results queue empty 30575 1726867622.70796: checking for any_errors_fatal 30575 1726867622.70799: done checking for any_errors_fatal 30575 1726867622.70799: checking for 
max_fail_percentage 30575 1726867622.70801: done checking for max_fail_percentage 30575 1726867622.70801: checking to see if all hosts have failed and the running result is not ok 30575 1726867622.70802: done checking to see if all hosts have failed 30575 1726867622.70803: getting the remaining hosts for this loop 30575 1726867622.70804: done getting the remaining hosts for this loop 30575 1726867622.70808: getting the next task for host managed_node3 30575 1726867622.70817: done getting next task for host managed_node3 30575 1726867622.70821: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867622.70827: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867622.70839: getting variables 30575 1726867622.70844: in VariableManager get_vars() 30575 1726867622.70871: Calling all_inventory to load vars for managed_node3 30575 1726867622.70874: Calling groups_inventory to load vars for managed_node3 30575 1726867622.70879: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867622.70899: Calling all_plugins_play to load vars for managed_node3 30575 1726867622.70903: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867622.70906: Calling groups_plugins_play to load vars for managed_node3 30575 1726867622.72104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867622.73969: done with get_vars() 30575 1726867622.74292: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:27:02 -0400 (0:00:01.777) 0:00:58.121 ****** 30575 1726867622.74385: entering _queue_task() for managed_node3/package_facts 30575 1726867622.75030: worker is 1 (out of 1 available) 30575 1726867622.75042: exiting _queue_task() for managed_node3/package_facts 30575 1726867622.75055: done queuing things up, now waiting for results queue to drain 30575 1726867622.75056: waiting for pending results... 
30575 1726867622.75507: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867622.75838: in run() - task 0affcac9-a3a5-e081-a588-0000000012d7 30575 1726867622.75842: variable 'ansible_search_path' from source: unknown 30575 1726867622.75845: variable 'ansible_search_path' from source: unknown 30575 1726867622.75847: calling self._execute() 30575 1726867622.75942: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867622.75957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867622.75972: variable 'omit' from source: magic vars 30575 1726867622.76417: variable 'ansible_distribution_major_version' from source: facts 30575 1726867622.76434: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867622.76444: variable 'omit' from source: magic vars 30575 1726867622.76584: variable 'omit' from source: magic vars 30575 1726867622.76587: variable 'omit' from source: magic vars 30575 1726867622.76638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867622.76679: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867622.76714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867622.76739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867622.76757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867622.76792: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867622.76806: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867622.76910: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867622.76933: Set connection var ansible_pipelining to False 30575 1726867622.76943: Set connection var ansible_shell_type to sh 30575 1726867622.76954: Set connection var ansible_shell_executable to /bin/sh 30575 1726867622.76965: Set connection var ansible_timeout to 10 30575 1726867622.76975: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867622.76988: Set connection var ansible_connection to ssh 30575 1726867622.77020: variable 'ansible_shell_executable' from source: unknown 30575 1726867622.77030: variable 'ansible_connection' from source: unknown 30575 1726867622.77040: variable 'ansible_module_compression' from source: unknown 30575 1726867622.77046: variable 'ansible_shell_type' from source: unknown 30575 1726867622.77052: variable 'ansible_shell_executable' from source: unknown 30575 1726867622.77057: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867622.77064: variable 'ansible_pipelining' from source: unknown 30575 1726867622.77070: variable 'ansible_timeout' from source: unknown 30575 1726867622.77079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867622.77345: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867622.77350: variable 'omit' from source: magic vars 30575 1726867622.77353: starting attempt loop 30575 1726867622.77355: running the handler 30575 1726867622.77359: _low_level_execute_command(): starting 30575 1726867622.77361: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867622.78098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867622.78163: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867622.78216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867622.78235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867622.78287: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867622.78434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867622.80145: stdout chunk (state=3): >>>/root <<< 30575 1726867622.80196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867622.80502: stderr chunk (state=3): >>><<< 30575 1726867622.80505: stdout chunk (state=3): >>><<< 30575 1726867622.80509: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867622.80519: _low_level_execute_command(): starting 30575 1726867622.80523: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867622.8031871-33359-182086686784073 `" && echo ansible-tmp-1726867622.8031871-33359-182086686784073="` echo /root/.ansible/tmp/ansible-tmp-1726867622.8031871-33359-182086686784073 `" ) && sleep 0' 30575 1726867622.81606: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867622.81660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867622.81674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867622.81708: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867622.81871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867622.81927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867622.82034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867622.84098: stdout chunk (state=3): >>>ansible-tmp-1726867622.8031871-33359-182086686784073=/root/.ansible/tmp/ansible-tmp-1726867622.8031871-33359-182086686784073 <<< 30575 1726867622.84233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867622.84236: stdout chunk (state=3): >>><<< 30575 1726867622.84243: stderr chunk (state=3): >>><<< 30575 1726867622.84259: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867622.8031871-33359-182086686784073=/root/.ansible/tmp/ansible-tmp-1726867622.8031871-33359-182086686784073 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867622.84315: variable 'ansible_module_compression' from source: unknown 30575 1726867622.84365: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30575 1726867622.84562: variable 'ansible_facts' from source: unknown 30575 1726867622.84959: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867622.8031871-33359-182086686784073/AnsiballZ_package_facts.py 30575 1726867622.85295: Sending initial data 30575 1726867622.85298: Sent initial data (162 bytes) 30575 1726867622.87202: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867622.87483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867622.87536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867622.89201: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867622.89247: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867622.89305: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp2u1ruoft /root/.ansible/tmp/ansible-tmp-1726867622.8031871-33359-182086686784073/AnsiballZ_package_facts.py <<< 30575 1726867622.89383: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867622.8031871-33359-182086686784073/AnsiballZ_package_facts.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp2u1ruoft" to remote "/root/.ansible/tmp/ansible-tmp-1726867622.8031871-33359-182086686784073/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867622.8031871-33359-182086686784073/AnsiballZ_package_facts.py" <<< 30575 1726867622.91994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867622.91997: stdout chunk (state=3): >>><<< 30575 1726867622.92000: stderr chunk (state=3): >>><<< 30575 1726867622.92049: done transferring module to remote 30575 1726867622.92091: _low_level_execute_command(): starting 30575 1726867622.92106: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867622.8031871-33359-182086686784073/ /root/.ansible/tmp/ansible-tmp-1726867622.8031871-33359-182086686784073/AnsiballZ_package_facts.py && sleep 0' 30575 1726867622.92793: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867622.92880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867622.92930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867622.92972: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867622.93112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867622.94996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867622.95030: stderr chunk (state=3): >>><<< 30575 1726867622.95034: stdout chunk (state=3): >>><<< 30575 1726867622.95041: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867622.95064: _low_level_execute_command(): starting 30575 1726867622.95067: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867622.8031871-33359-182086686784073/AnsiballZ_package_facts.py && sleep 0' 30575 1726867622.95482: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867622.95503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867622.95507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867622.95565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867622.95568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867622.95620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 
1726867623.40714: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": 
[{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", 
"release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", 
"version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": 
[{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": 
"iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", 
"version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": 
"kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": 
"kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", 
"version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": 
[{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": 
"perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": 
"5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30575 1726867623.40758: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": 
[{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": 
[{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30575 1726867623.40763: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30575 1726867623.42385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867623.42422: stderr chunk (state=3): >>><<< 30575 1726867623.42432: stdout chunk (state=3): >>><<< 30575 1726867623.42473: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": 
[{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": 
"13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", 
"version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": 
"npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", 
"version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": 
"76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", 
"release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": 
[{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": 
"9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", 
"release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": 
"perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", 
"release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": 
"7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", 
"version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", 
"release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867623.45168: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867622.8031871-33359-182086686784073/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867623.45325: _low_level_execute_command(): starting 30575 1726867623.45328: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867622.8031871-33359-182086686784073/ > /dev/null 2>&1 && sleep 0' 30575 1726867623.45937: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867623.45943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867623.45971: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867623.45974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867623.45979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867623.45981: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867623.46037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867623.46041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867623.46043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867623.46100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867623.48182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867623.48185: stdout chunk (state=3): >>><<< 30575 1726867623.48188: stderr chunk (state=3): >>><<< 30575 1726867623.48191: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867623.48193: handler run complete 30575 1726867623.49518: variable 'ansible_facts' from source: unknown 30575 1726867623.50093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867623.51872: variable 'ansible_facts' from source: unknown 30575 1726867623.52184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867623.52847: attempt loop complete, returning result 30575 1726867623.52870: _execute() done 30575 1726867623.52873: dumping result to json 30575 1726867623.53185: done dumping result, returning 30575 1726867623.53188: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-e081-a588-0000000012d7] 30575 1726867623.53190: sending task result for task 0affcac9-a3a5-e081-a588-0000000012d7 30575 1726867623.54544: done sending task result for task 0affcac9-a3a5-e081-a588-0000000012d7 30575 1726867623.54547: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867623.54639: no more pending results, returning what we have 30575 1726867623.54641: results queue empty 30575 1726867623.54642: checking for any_errors_fatal 30575 1726867623.54645: done checking for any_errors_fatal 30575 1726867623.54645: checking for max_fail_percentage 30575 
1726867623.54646: done checking for max_fail_percentage 30575 1726867623.54647: checking to see if all hosts have failed and the running result is not ok 30575 1726867623.54647: done checking to see if all hosts have failed 30575 1726867623.54648: getting the remaining hosts for this loop 30575 1726867623.54649: done getting the remaining hosts for this loop 30575 1726867623.54651: getting the next task for host managed_node3 30575 1726867623.54656: done getting next task for host managed_node3 30575 1726867623.54659: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867623.54662: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867623.54670: getting variables 30575 1726867623.54671: in VariableManager get_vars() 30575 1726867623.54694: Calling all_inventory to load vars for managed_node3 30575 1726867623.54696: Calling groups_inventory to load vars for managed_node3 30575 1726867623.54697: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867623.54703: Calling all_plugins_play to load vars for managed_node3 30575 1726867623.54705: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867623.54707: Calling groups_plugins_play to load vars for managed_node3 30575 1726867623.55394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867623.56635: done with get_vars() 30575 1726867623.56656: done getting variables 30575 1726867623.56716: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:27:03 -0400 (0:00:00.823) 0:00:58.945 ****** 30575 1726867623.56746: entering _queue_task() for managed_node3/debug 30575 1726867623.57039: worker is 1 (out of 1 available) 30575 1726867623.57052: exiting _queue_task() for managed_node3/debug 30575 1726867623.57066: done queuing things up, now waiting for results queue to drain 30575 1726867623.57067: waiting for pending results... 
30575 1726867623.57495: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867623.57499: in run() - task 0affcac9-a3a5-e081-a588-00000000127b 30575 1726867623.57507: variable 'ansible_search_path' from source: unknown 30575 1726867623.57515: variable 'ansible_search_path' from source: unknown 30575 1726867623.57556: calling self._execute() 30575 1726867623.57676: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867623.57685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867623.57698: variable 'omit' from source: magic vars 30575 1726867623.57999: variable 'ansible_distribution_major_version' from source: facts 30575 1726867623.58007: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867623.58022: variable 'omit' from source: magic vars 30575 1726867623.58064: variable 'omit' from source: magic vars 30575 1726867623.58133: variable 'network_provider' from source: set_fact 30575 1726867623.58147: variable 'omit' from source: magic vars 30575 1726867623.58184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867623.58210: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867623.58226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867623.58239: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867623.58249: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867623.58278: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867623.58282: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867623.58284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867623.58350: Set connection var ansible_pipelining to False 30575 1726867623.58354: Set connection var ansible_shell_type to sh 30575 1726867623.58358: Set connection var ansible_shell_executable to /bin/sh 30575 1726867623.58364: Set connection var ansible_timeout to 10 30575 1726867623.58370: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867623.58378: Set connection var ansible_connection to ssh 30575 1726867623.58399: variable 'ansible_shell_executable' from source: unknown 30575 1726867623.58402: variable 'ansible_connection' from source: unknown 30575 1726867623.58405: variable 'ansible_module_compression' from source: unknown 30575 1726867623.58408: variable 'ansible_shell_type' from source: unknown 30575 1726867623.58410: variable 'ansible_shell_executable' from source: unknown 30575 1726867623.58412: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867623.58418: variable 'ansible_pipelining' from source: unknown 30575 1726867623.58420: variable 'ansible_timeout' from source: unknown 30575 1726867623.58422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867623.58523: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867623.58533: variable 'omit' from source: magic vars 30575 1726867623.58538: starting attempt loop 30575 1726867623.58541: running the handler 30575 1726867623.58592: handler run complete 30575 1726867623.58603: attempt loop complete, returning result 30575 1726867623.58606: _execute() done 30575 1726867623.58610: dumping result to json 30575 1726867623.58615: done dumping result, returning 
30575 1726867623.58618: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-e081-a588-00000000127b] 30575 1726867623.58624: sending task result for task 0affcac9-a3a5-e081-a588-00000000127b 30575 1726867623.58702: done sending task result for task 0affcac9-a3a5-e081-a588-00000000127b 30575 1726867623.58706: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30575 1726867623.58780: no more pending results, returning what we have 30575 1726867623.58783: results queue empty 30575 1726867623.58784: checking for any_errors_fatal 30575 1726867623.58790: done checking for any_errors_fatal 30575 1726867623.58790: checking for max_fail_percentage 30575 1726867623.58792: done checking for max_fail_percentage 30575 1726867623.58793: checking to see if all hosts have failed and the running result is not ok 30575 1726867623.58793: done checking to see if all hosts have failed 30575 1726867623.58794: getting the remaining hosts for this loop 30575 1726867623.58795: done getting the remaining hosts for this loop 30575 1726867623.58799: getting the next task for host managed_node3 30575 1726867623.58806: done getting next task for host managed_node3 30575 1726867623.58810: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867623.58817: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867623.58828: getting variables 30575 1726867623.58829: in VariableManager get_vars() 30575 1726867623.58871: Calling all_inventory to load vars for managed_node3 30575 1726867623.58873: Calling groups_inventory to load vars for managed_node3 30575 1726867623.58875: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867623.58885: Calling all_plugins_play to load vars for managed_node3 30575 1726867623.58887: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867623.58889: Calling groups_plugins_play to load vars for managed_node3 30575 1726867623.59728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867623.60851: done with get_vars() 30575 1726867623.60872: done getting variables 30575 1726867623.60929: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:27:03 -0400 (0:00:00.042) 0:00:58.987 ****** 30575 1726867623.60966: entering _queue_task() for managed_node3/fail 30575 1726867623.61227: worker is 1 (out of 1 available) 30575 1726867623.61240: exiting _queue_task() for managed_node3/fail 30575 1726867623.61254: done queuing things up, now waiting for results queue to drain 30575 1726867623.61256: waiting for pending results... 30575 1726867623.61478: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867623.61582: in run() - task 0affcac9-a3a5-e081-a588-00000000127c 30575 1726867623.61595: variable 'ansible_search_path' from source: unknown 30575 1726867623.61599: variable 'ansible_search_path' from source: unknown 30575 1726867623.61626: calling self._execute() 30575 1726867623.61695: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867623.61700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867623.61717: variable 'omit' from source: magic vars 30575 1726867623.61982: variable 'ansible_distribution_major_version' from source: facts 30575 1726867623.61991: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867623.62080: variable 'network_state' from source: role '' defaults 30575 1726867623.62088: Evaluated conditional (network_state != {}): False 30575 1726867623.62092: when evaluation is False, skipping this task 30575 1726867623.62094: _execute() done 30575 1726867623.62097: dumping result to json 30575 1726867623.62099: done dumping result, returning 30575 1726867623.62107: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-e081-a588-00000000127c] 30575 1726867623.62112: sending task result for task 0affcac9-a3a5-e081-a588-00000000127c 30575 1726867623.62202: done sending task result for task 0affcac9-a3a5-e081-a588-00000000127c 30575 1726867623.62205: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867623.62252: no more pending results, returning what we have 30575 1726867623.62255: results queue empty 30575 1726867623.62256: checking for any_errors_fatal 30575 1726867623.62261: done checking for any_errors_fatal 30575 1726867623.62261: checking for max_fail_percentage 30575 1726867623.62263: done checking for max_fail_percentage 30575 1726867623.62264: checking to see if all hosts have failed and the running result is not ok 30575 1726867623.62264: done checking to see if all hosts have failed 30575 1726867623.62265: getting the remaining hosts for this loop 30575 1726867623.62266: done getting the remaining hosts for this loop 30575 1726867623.62270: getting the next task for host managed_node3 30575 1726867623.62276: done getting next task for host managed_node3 30575 1726867623.62282: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867623.62286: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867623.62304: getting variables 30575 1726867623.62305: in VariableManager get_vars() 30575 1726867623.62338: Calling all_inventory to load vars for managed_node3 30575 1726867623.62340: Calling groups_inventory to load vars for managed_node3 30575 1726867623.62342: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867623.62349: Calling all_plugins_play to load vars for managed_node3 30575 1726867623.62352: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867623.62354: Calling groups_plugins_play to load vars for managed_node3 30575 1726867623.63106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867623.63966: done with get_vars() 30575 1726867623.63983: done getting variables 30575 1726867623.64024: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:27:03 -0400 (0:00:00.030) 0:00:59.018 ****** 30575 1726867623.64049: entering _queue_task() for managed_node3/fail 30575 1726867623.64237: worker is 1 (out of 1 available) 30575 1726867623.64251: exiting _queue_task() for managed_node3/fail 30575 1726867623.64263: done queuing things up, now waiting for results queue to drain 30575 1726867623.64264: waiting for pending results... 30575 1726867623.64448: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867623.64593: in run() - task 0affcac9-a3a5-e081-a588-00000000127d 30575 1726867623.64602: variable 'ansible_search_path' from source: unknown 30575 1726867623.64604: variable 'ansible_search_path' from source: unknown 30575 1726867623.64621: calling self._execute() 30575 1726867623.64783: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867623.64787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867623.64791: variable 'omit' from source: magic vars 30575 1726867623.65109: variable 'ansible_distribution_major_version' from source: facts 30575 1726867623.65123: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867623.65237: variable 'network_state' from source: role '' defaults 30575 1726867623.65249: Evaluated conditional (network_state != {}): False 30575 1726867623.65252: when evaluation is False, skipping this task 30575 1726867623.65255: _execute() done 30575 1726867623.65257: dumping result to json 30575 1726867623.65261: done dumping result, returning 30575 1726867623.65264: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-e081-a588-00000000127d] 30575 1726867623.65266: sending task result for task 0affcac9-a3a5-e081-a588-00000000127d 30575 1726867623.65411: done sending task result for task 0affcac9-a3a5-e081-a588-00000000127d 30575 1726867623.65417: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867623.65517: no more pending results, returning what we have 30575 1726867623.65521: results queue empty 30575 1726867623.65522: checking for any_errors_fatal 30575 1726867623.65528: done checking for any_errors_fatal 30575 1726867623.65529: checking for max_fail_percentage 30575 1726867623.65530: done checking for max_fail_percentage 30575 1726867623.65531: checking to see if all hosts have failed and the running result is not ok 30575 1726867623.65532: done checking to see if all hosts have failed 30575 1726867623.65532: getting the remaining hosts for this loop 30575 1726867623.65534: done getting the remaining hosts for this loop 30575 1726867623.65537: getting the next task for host managed_node3 30575 1726867623.65543: done getting next task for host managed_node3 30575 1726867623.65547: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867623.65552: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867623.65569: getting variables 30575 1726867623.65571: in VariableManager get_vars() 30575 1726867623.65623: Calling all_inventory to load vars for managed_node3 30575 1726867623.65626: Calling groups_inventory to load vars for managed_node3 30575 1726867623.65635: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867623.65644: Calling all_plugins_play to load vars for managed_node3 30575 1726867623.65647: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867623.65650: Calling groups_plugins_play to load vars for managed_node3 30575 1726867623.66624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867623.67491: done with get_vars() 30575 1726867623.67506: done getting variables 30575 1726867623.67549: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:27:03 -0400 (0:00:00.035) 0:00:59.053 ****** 30575 1726867623.67573: entering _queue_task() for managed_node3/fail 30575 1726867623.67827: worker is 1 (out of 1 available) 30575 1726867623.67840: exiting _queue_task() for managed_node3/fail 30575 1726867623.67852: done queuing things up, now waiting for results queue to drain 30575 1726867623.67854: waiting for pending results... 30575 1726867623.68295: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867623.68302: in run() - task 0affcac9-a3a5-e081-a588-00000000127e 30575 1726867623.68324: variable 'ansible_search_path' from source: unknown 30575 1726867623.68333: variable 'ansible_search_path' from source: unknown 30575 1726867623.68374: calling self._execute() 30575 1726867623.68467: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867623.68480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867623.68496: variable 'omit' from source: magic vars 30575 1726867623.68866: variable 'ansible_distribution_major_version' from source: facts 30575 1726867623.68884: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867623.69063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867623.71367: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867623.71682: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867623.71686: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867623.71689: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867623.71691: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867623.71694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867623.71696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867623.71698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867623.71748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867623.71769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867623.71869: variable 'ansible_distribution_major_version' from source: facts 30575 1726867623.71894: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30575 1726867623.72001: variable 'ansible_distribution' from source: facts 30575 1726867623.72009: variable '__network_rh_distros' from source: role '' defaults 30575 1726867623.72026: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30575 1726867623.72283: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867623.72361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867623.72364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867623.72399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867623.72422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867623.72483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867623.72515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867623.72546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867623.72684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 
1726867623.72687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867623.72690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867623.72693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867623.72716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867623.72761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867623.72783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867623.73123: variable 'network_connections' from source: include params 30575 1726867623.73141: variable 'interface' from source: play vars 30575 1726867623.73208: variable 'interface' from source: play vars 30575 1726867623.73233: variable 'network_state' from source: role '' defaults 30575 1726867623.73306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867623.73484: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867623.73528: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867623.73568: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867623.73604: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867623.73673: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867623.73774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867623.73786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867623.73789: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867623.73803: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30575 1726867623.73811: when evaluation is False, skipping this task 30575 1726867623.73822: _execute() done 30575 1726867623.73830: dumping result to json 30575 1726867623.73839: done dumping result, returning 30575 1726867623.73851: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-e081-a588-00000000127e] 30575 1726867623.73861: sending task result for task 
0affcac9-a3a5-e081-a588-00000000127e skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30575 1726867623.74041: no more pending results, returning what we have 30575 1726867623.74045: results queue empty 30575 1726867623.74046: checking for any_errors_fatal 30575 1726867623.74052: done checking for any_errors_fatal 30575 1726867623.74052: checking for max_fail_percentage 30575 1726867623.74055: done checking for max_fail_percentage 30575 1726867623.74056: checking to see if all hosts have failed and the running result is not ok 30575 1726867623.74057: done checking to see if all hosts have failed 30575 1726867623.74057: getting the remaining hosts for this loop 30575 1726867623.74060: done getting the remaining hosts for this loop 30575 1726867623.74065: getting the next task for host managed_node3 30575 1726867623.74073: done getting next task for host managed_node3 30575 1726867623.74080: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867623.74085: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867623.74111: getting variables 30575 1726867623.74116: in VariableManager get_vars() 30575 1726867623.74160: Calling all_inventory to load vars for managed_node3 30575 1726867623.74162: Calling groups_inventory to load vars for managed_node3 30575 1726867623.74165: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867623.74176: Calling all_plugins_play to load vars for managed_node3 30575 1726867623.74385: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867623.74390: Calling groups_plugins_play to load vars for managed_node3 30575 1726867623.75090: done sending task result for task 0affcac9-a3a5-e081-a588-00000000127e 30575 1726867623.75093: WORKER PROCESS EXITING 30575 1726867623.75889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867623.77632: done with get_vars() 30575 1726867623.77653: done getting variables 30575 1726867623.77711: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due 
to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:27:03 -0400 (0:00:00.101) 0:00:59.155 ****** 30575 1726867623.77747: entering _queue_task() for managed_node3/dnf 30575 1726867623.78064: worker is 1 (out of 1 available) 30575 1726867623.78078: exiting _queue_task() for managed_node3/dnf 30575 1726867623.78091: done queuing things up, now waiting for results queue to drain 30575 1726867623.78093: waiting for pending results... 30575 1726867623.78387: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867623.78527: in run() - task 0affcac9-a3a5-e081-a588-00000000127f 30575 1726867623.78544: variable 'ansible_search_path' from source: unknown 30575 1726867623.78550: variable 'ansible_search_path' from source: unknown 30575 1726867623.78594: calling self._execute() 30575 1726867623.78706: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867623.78727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867623.78745: variable 'omit' from source: magic vars 30575 1726867623.79154: variable 'ansible_distribution_major_version' from source: facts 30575 1726867623.79176: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867623.79391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867623.81712: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867623.81797: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867623.81839: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867623.81885: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867623.81918: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867623.81999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867623.82036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867623.82065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867623.82118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867623.82137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867623.82249: variable 'ansible_distribution' from source: facts 30575 1726867623.82258: variable 'ansible_distribution_major_version' from source: facts 30575 1726867623.82276: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30575 1726867623.82392: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867623.82535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867623.82562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867623.82593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867623.82642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867623.82660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867623.82704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867623.82737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867623.82767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867623.82810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867623.82832: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867623.82882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867623.82909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867623.82961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867623.82987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867623.83005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867623.83167: variable 'network_connections' from source: include params 30575 1726867623.83284: variable 'interface' from source: play vars 30575 1726867623.83287: variable 'interface' from source: play vars 30575 1726867623.83332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867623.83511: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867623.83556: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867623.83591: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867623.83632: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867623.83692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867623.83729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867623.83769: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867623.83801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867623.83851: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867623.84095: variable 'network_connections' from source: include params 30575 1726867623.84106: variable 'interface' from source: play vars 30575 1726867623.84181: variable 'interface' from source: play vars 30575 1726867623.84211: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867623.84264: when evaluation is False, skipping this task 30575 1726867623.84268: _execute() done 30575 1726867623.84270: dumping result to json 30575 1726867623.84272: done dumping result, returning 30575 1726867623.84274: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-00000000127f] 30575 
1726867623.84278: sending task result for task 0affcac9-a3a5-e081-a588-00000000127f skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867623.84424: no more pending results, returning what we have 30575 1726867623.84428: results queue empty 30575 1726867623.84429: checking for any_errors_fatal 30575 1726867623.84438: done checking for any_errors_fatal 30575 1726867623.84439: checking for max_fail_percentage 30575 1726867623.84442: done checking for max_fail_percentage 30575 1726867623.84443: checking to see if all hosts have failed and the running result is not ok 30575 1726867623.84444: done checking to see if all hosts have failed 30575 1726867623.84444: getting the remaining hosts for this loop 30575 1726867623.84447: done getting the remaining hosts for this loop 30575 1726867623.84451: getting the next task for host managed_node3 30575 1726867623.84462: done getting next task for host managed_node3 30575 1726867623.84466: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867623.84472: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867623.84500: getting variables 30575 1726867623.84502: in VariableManager get_vars() 30575 1726867623.84547: Calling all_inventory to load vars for managed_node3 30575 1726867623.84550: Calling groups_inventory to load vars for managed_node3 30575 1726867623.84553: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867623.84564: Calling all_plugins_play to load vars for managed_node3 30575 1726867623.84567: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867623.84570: Calling groups_plugins_play to load vars for managed_node3 30575 1726867623.85301: done sending task result for task 0affcac9-a3a5-e081-a588-00000000127f 30575 1726867623.85304: WORKER PROCESS EXITING 30575 1726867623.86258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867623.88452: done with get_vars() 30575 1726867623.88493: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867623.88629: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:27:03 -0400 (0:00:00.109) 0:00:59.264 ****** 30575 1726867623.88662: entering _queue_task() for managed_node3/yum 30575 1726867623.89012: worker is 1 (out of 1 available) 30575 1726867623.89027: exiting _queue_task() for managed_node3/yum 30575 1726867623.89040: done queuing things up, now waiting for results queue to drain 30575 1726867623.89042: waiting for pending results... 30575 1726867623.89339: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867623.89486: in run() - task 0affcac9-a3a5-e081-a588-000000001280 30575 1726867623.89504: variable 'ansible_search_path' from source: unknown 30575 1726867623.89509: variable 'ansible_search_path' from source: unknown 30575 1726867623.89544: calling self._execute() 30575 1726867623.89638: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867623.89644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867623.89655: variable 'omit' from source: magic vars 30575 1726867623.90084: variable 'ansible_distribution_major_version' from source: facts 30575 1726867623.90087: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867623.90267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867623.94130: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867623.94345: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867623.94350: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867623.94383: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867623.94486: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867623.94782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867623.94786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867623.94789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867623.94888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867623.94910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867623.95123: variable 'ansible_distribution_major_version' from source: facts 30575 1726867623.95145: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30575 1726867623.95154: when evaluation is False, skipping this task 30575 1726867623.95162: _execute() done 30575 1726867623.95170: dumping result to json 30575 1726867623.95181: done dumping result, 
returning 30575 1726867623.95223: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001280] 30575 1726867623.95235: sending task result for task 0affcac9-a3a5-e081-a588-000000001280 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30575 1726867623.95447: no more pending results, returning what we have 30575 1726867623.95451: results queue empty 30575 1726867623.95452: checking for any_errors_fatal 30575 1726867623.95458: done checking for any_errors_fatal 30575 1726867623.95459: checking for max_fail_percentage 30575 1726867623.95461: done checking for max_fail_percentage 30575 1726867623.95462: checking to see if all hosts have failed and the running result is not ok 30575 1726867623.95464: done checking to see if all hosts have failed 30575 1726867623.95464: getting the remaining hosts for this loop 30575 1726867623.95466: done getting the remaining hosts for this loop 30575 1726867623.95471: getting the next task for host managed_node3 30575 1726867623.95482: done getting next task for host managed_node3 30575 1726867623.95487: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867623.95492: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867623.95520: getting variables 30575 1726867623.95522: in VariableManager get_vars() 30575 1726867623.95562: Calling all_inventory to load vars for managed_node3 30575 1726867623.95565: Calling groups_inventory to load vars for managed_node3 30575 1726867623.95567: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867623.95576: Calling all_plugins_play to load vars for managed_node3 30575 1726867623.95888: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867623.95892: Calling groups_plugins_play to load vars for managed_node3 30575 1726867623.96605: done sending task result for task 0affcac9-a3a5-e081-a588-000000001280 30575 1726867623.96608: WORKER PROCESS EXITING 30575 1726867623.98992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867624.07183: done with get_vars() 30575 1726867624.07211: done getting variables 30575 1726867624.07267: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:27:04 -0400 (0:00:00.186) 0:00:59.450 ****** 30575 1726867624.07300: entering _queue_task() for managed_node3/fail 30575 1726867624.07815: worker is 1 (out of 1 available) 30575 1726867624.07828: exiting _queue_task() for managed_node3/fail 30575 1726867624.07840: done queuing things up, now waiting for results queue to drain 30575 1726867624.07842: waiting for pending results... 30575 1726867624.08180: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867624.08343: in run() - task 0affcac9-a3a5-e081-a588-000000001281 30575 1726867624.08361: variable 'ansible_search_path' from source: unknown 30575 1726867624.08368: variable 'ansible_search_path' from source: unknown 30575 1726867624.08423: calling self._execute() 30575 1726867624.08546: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867624.08560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867624.08576: variable 'omit' from source: magic vars 30575 1726867624.08992: variable 'ansible_distribution_major_version' from source: facts 30575 1726867624.09067: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867624.09146: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867624.09366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867624.11724: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867624.11812: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867624.11854: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867624.11901: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867624.11932: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867624.12020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867624.12117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867624.12121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.12135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867624.12155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867624.12209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 
1726867624.12241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867624.12268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.12316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867624.12340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867624.12391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867624.12419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867624.12451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.12549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867624.12552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30575 1726867624.12692: variable 'network_connections' from source: include params 30575 1726867624.12709: variable 'interface' from source: play vars 30575 1726867624.12783: variable 'interface' from source: play vars 30575 1726867624.12859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867624.13034: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867624.13079: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867624.13118: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867624.13150: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867624.13282: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867624.13285: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867624.13288: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.13290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867624.13334: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867624.13574: variable 'network_connections' from source: include params 30575 1726867624.13587: variable 'interface' from source: play 
vars 30575 1726867624.13652: variable 'interface' from source: play vars 30575 1726867624.13682: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867624.13691: when evaluation is False, skipping this task 30575 1726867624.13697: _execute() done 30575 1726867624.13703: dumping result to json 30575 1726867624.13710: done dumping result, returning 30575 1726867624.13721: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001281] 30575 1726867624.13731: sending task result for task 0affcac9-a3a5-e081-a588-000000001281 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867624.13903: no more pending results, returning what we have 30575 1726867624.13907: results queue empty 30575 1726867624.13907: checking for any_errors_fatal 30575 1726867624.13919: done checking for any_errors_fatal 30575 1726867624.13920: checking for max_fail_percentage 30575 1726867624.13922: done checking for max_fail_percentage 30575 1726867624.13923: checking to see if all hosts have failed and the running result is not ok 30575 1726867624.13924: done checking to see if all hosts have failed 30575 1726867624.13925: getting the remaining hosts for this loop 30575 1726867624.13926: done getting the remaining hosts for this loop 30575 1726867624.13930: getting the next task for host managed_node3 30575 1726867624.13941: done getting next task for host managed_node3 30575 1726867624.13946: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30575 1726867624.13952: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867624.13979: getting variables 30575 1726867624.13981: in VariableManager get_vars() 30575 1726867624.14024: Calling all_inventory to load vars for managed_node3 30575 1726867624.14027: Calling groups_inventory to load vars for managed_node3 30575 1726867624.14030: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867624.14041: Calling all_plugins_play to load vars for managed_node3 30575 1726867624.14044: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867624.14047: Calling groups_plugins_play to load vars for managed_node3 30575 1726867624.14893: done sending task result for task 0affcac9-a3a5-e081-a588-000000001281 30575 1726867624.14896: WORKER PROCESS EXITING 30575 1726867624.15700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867624.18473: done with get_vars() 30575 1726867624.18500: done getting variables 30575 1726867624.18559: Loading ActionModule 
'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:27:04 -0400 (0:00:00.112) 0:00:59.563 ****** 30575 1726867624.18597: entering _queue_task() for managed_node3/package 30575 1726867624.19425: worker is 1 (out of 1 available) 30575 1726867624.19436: exiting _queue_task() for managed_node3/package 30575 1726867624.19446: done queuing things up, now waiting for results queue to drain 30575 1726867624.19448: waiting for pending results... 30575 1726867624.19700: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30575 1726867624.20364: in run() - task 0affcac9-a3a5-e081-a588-000000001282 30575 1726867624.20378: variable 'ansible_search_path' from source: unknown 30575 1726867624.20382: variable 'ansible_search_path' from source: unknown 30575 1726867624.20492: calling self._execute() 30575 1726867624.20716: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867624.20725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867624.20738: variable 'omit' from source: magic vars 30575 1726867624.21630: variable 'ansible_distribution_major_version' from source: facts 30575 1726867624.21641: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867624.22082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867624.22713: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 
1726867624.22762: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867624.22797: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867624.23071: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867624.23311: variable 'network_packages' from source: role '' defaults 30575 1726867624.23538: variable '__network_provider_setup' from source: role '' defaults 30575 1726867624.23548: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867624.23728: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867624.23742: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867624.23862: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867624.24296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867624.28952: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867624.28969: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867624.29008: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867624.29049: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867624.29082: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867624.29173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867624.29206: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867624.29476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.29482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867624.29484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867624.29487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867624.29489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867624.29491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.29493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867624.29495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 
1726867624.29732: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867624.29872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867624.29897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867624.29933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.29973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867624.29989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867624.30094: variable 'ansible_python' from source: facts 30575 1726867624.30123: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867624.30206: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867624.30448: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867624.30451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867624.30466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867624.30497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.30538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867624.30552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867624.30610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867624.30641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867624.30658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.30706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867624.30727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867624.30881: variable 'network_connections' from source: include params 
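The conditional skips traced in this log all follow the same shape: a role task guarded by `when:`, with the evaluated expression echoed verbatim ("Evaluated conditional (...): False"). As a purely illustrative sketch (not the role's actual source), the NetworkManager-restart consent check skipped earlier at tasks/main.yml:60 would look roughly like this; the module (`fail`) and the when-expression are taken from the trace, while the task body and message text are assumptions:

```yaml
# Hypothetical sketch of the consent check at tasks/main.yml:60.
# Module and condition come from the trace above; the msg is invented.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  fail:
    msg: Refusing to restart NetworkManager without explicit consent
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Since neither wireless nor team connections are defined in this run, the guard evaluates to False and the task is skipped, exactly as the result JSON above reports.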
30575 1726867624.30896: variable 'interface' from source: play vars 30575 1726867624.31005: variable 'interface' from source: play vars 30575 1726867624.31085: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867624.31282: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867624.31286: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.31288: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867624.31291: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867624.31555: variable 'network_connections' from source: include params 30575 1726867624.31558: variable 'interface' from source: play vars 30575 1726867624.31669: variable 'interface' from source: play vars 30575 1726867624.31723: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867624.31812: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867624.32214: variable 'network_connections' from source: include params 30575 1726867624.32217: variable 'interface' from source: play vars 30575 1726867624.32401: variable 'interface' from source: play vars 30575 1726867624.32426: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867624.32908: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867624.33419: variable 'network_connections' 
from source: include params 30575 1726867624.33424: variable 'interface' from source: play vars 30575 1726867624.33511: variable 'interface' from source: play vars 30575 1726867624.33571: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867624.33630: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867624.33636: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867624.33706: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867624.33936: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867624.34444: variable 'network_connections' from source: include params 30575 1726867624.34447: variable 'interface' from source: play vars 30575 1726867624.34506: variable 'interface' from source: play vars 30575 1726867624.34583: variable 'ansible_distribution' from source: facts 30575 1726867624.34586: variable '__network_rh_distros' from source: role '' defaults 30575 1726867624.34589: variable 'ansible_distribution_major_version' from source: facts 30575 1726867624.34591: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867624.34725: variable 'ansible_distribution' from source: facts 30575 1726867624.34728: variable '__network_rh_distros' from source: role '' defaults 30575 1726867624.34733: variable 'ansible_distribution_major_version' from source: facts 30575 1726867624.34755: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867624.34937: variable 'ansible_distribution' from source: facts 30575 1726867624.34940: variable '__network_rh_distros' from source: role '' defaults 30575 1726867624.34945: variable 'ansible_distribution_major_version' from source: facts 30575 1726867624.34991: variable 'network_provider' from source: set_fact 30575 
1726867624.35008: variable 'ansible_facts' from source: unknown 30575 1726867624.35960: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30575 1726867624.35963: when evaluation is False, skipping this task 30575 1726867624.35966: _execute() done 30575 1726867624.35968: dumping result to json 30575 1726867624.35970: done dumping result, returning 30575 1726867624.35974: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-e081-a588-000000001282] 30575 1726867624.35978: sending task result for task 0affcac9-a3a5-e081-a588-000000001282 30575 1726867624.36249: done sending task result for task 0affcac9-a3a5-e081-a588-000000001282 30575 1726867624.36252: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30575 1726867624.36313: no more pending results, returning what we have 30575 1726867624.36317: results queue empty 30575 1726867624.36318: checking for any_errors_fatal 30575 1726867624.36324: done checking for any_errors_fatal 30575 1726867624.36325: checking for max_fail_percentage 30575 1726867624.36326: done checking for max_fail_percentage 30575 1726867624.36327: checking to see if all hosts have failed and the running result is not ok 30575 1726867624.36328: done checking to see if all hosts have failed 30575 1726867624.36329: getting the remaining hosts for this loop 30575 1726867624.36331: done getting the remaining hosts for this loop 30575 1726867624.36335: getting the next task for host managed_node3 30575 1726867624.36344: done getting next task for host managed_node3 30575 1726867624.36348: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867624.36353: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867624.36385: getting variables 30575 1726867624.36387: in VariableManager get_vars() 30575 1726867624.36433: Calling all_inventory to load vars for managed_node3 30575 1726867624.36435: Calling groups_inventory to load vars for managed_node3 30575 1726867624.36437: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867624.36447: Calling all_plugins_play to load vars for managed_node3 30575 1726867624.36450: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867624.36452: Calling groups_plugins_play to load vars for managed_node3 30575 1726867624.39368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867624.40960: done with get_vars() 30575 1726867624.40986: done getting variables 30575 1726867624.41047: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:27:04 -0400 (0:00:00.224) 0:00:59.788 ****** 30575 1726867624.41088: entering _queue_task() for managed_node3/package 30575 1726867624.42213: worker is 1 (out of 1 available) 30575 1726867624.42224: exiting _queue_task() for managed_node3/package 30575 1726867624.42235: done queuing things up, now waiting for results queue to drain 30575 1726867624.42237: waiting for pending results... 
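The "Install packages" skip recorded above hinges on Ansible's `subset` test: the task only runs when some entry in `network_packages` is missing from the gathered package facts. A minimal sketch of such a guard follows; the `package` module and the when-expression match the trace, but the rest of the task body is an assumption, not the role's verbatim source:

```yaml
# Illustrative only: install role packages unless the package facts show
# that every requested package is already present on the managed host.
- name: Install packages
  package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
```

Because all required packages were already installed on managed_node3, `network_packages` is a subset of `ansible_facts.packages.keys()`, the negated condition is False, and the task skips without touching the host.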
30575 1726867624.42654: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867624.42698: in run() - task 0affcac9-a3a5-e081-a588-000000001283 30575 1726867624.42722: variable 'ansible_search_path' from source: unknown 30575 1726867624.42732: variable 'ansible_search_path' from source: unknown 30575 1726867624.42782: calling self._execute() 30575 1726867624.42885: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867624.42898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867624.42914: variable 'omit' from source: magic vars 30575 1726867624.43306: variable 'ansible_distribution_major_version' from source: facts 30575 1726867624.43324: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867624.43451: variable 'network_state' from source: role '' defaults 30575 1726867624.43467: Evaluated conditional (network_state != {}): False 30575 1726867624.43475: when evaluation is False, skipping this task 30575 1726867624.43486: _execute() done 30575 1726867624.43493: dumping result to json 30575 1726867624.43500: done dumping result, returning 30575 1726867624.43515: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000001283] 30575 1726867624.43525: sending task result for task 0affcac9-a3a5-e081-a588-000000001283 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867624.43754: no more pending results, returning what we have 30575 1726867624.43759: results queue empty 30575 1726867624.43760: checking for any_errors_fatal 30575 1726867624.43766: done checking for any_errors_fatal 30575 1726867624.43767: checking for max_fail_percentage 30575 
1726867624.43770: done checking for max_fail_percentage 30575 1726867624.43771: checking to see if all hosts have failed and the running result is not ok 30575 1726867624.43772: done checking to see if all hosts have failed 30575 1726867624.43772: getting the remaining hosts for this loop 30575 1726867624.43774: done getting the remaining hosts for this loop 30575 1726867624.43779: getting the next task for host managed_node3 30575 1726867624.43789: done getting next task for host managed_node3 30575 1726867624.43793: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867624.43799: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867624.43828: getting variables 30575 1726867624.43830: in VariableManager get_vars() 30575 1726867624.43873: Calling all_inventory to load vars for managed_node3 30575 1726867624.43876: Calling groups_inventory to load vars for managed_node3 30575 1726867624.43984: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867624.43996: Calling all_plugins_play to load vars for managed_node3 30575 1726867624.44000: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867624.44002: Calling groups_plugins_play to load vars for managed_node3 30575 1726867624.44690: done sending task result for task 0affcac9-a3a5-e081-a588-000000001283 30575 1726867624.44693: WORKER PROCESS EXITING 30575 1726867624.45703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867624.48554: done with get_vars() 30575 1726867624.48588: done getting variables 30575 1726867624.48652: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:27:04 -0400 (0:00:00.076) 0:00:59.864 ****** 30575 1726867624.48704: entering _queue_task() for managed_node3/package 30575 1726867624.49282: worker is 1 (out of 1 available) 30575 1726867624.49293: exiting _queue_task() for managed_node3/package 30575 1726867624.49304: done queuing things up, now waiting for results queue to drain 30575 1726867624.49305: waiting for pending results... 
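The skip above for "Install NetworkManager and nmstate when using network_state variable" follows from `network_state` defaulting to `{}`: the guard `network_state != {}` is False unless the caller supplies a declarative state, so nmstate support is only pulled in when it is actually needed. A hedged sketch of that task; the condition and the `package` module come from the trace, while the package list is inferred from the task name rather than taken from the role source:

```yaml
# Sketch, not the role's verbatim source: install nmstate support only
# when a non-empty network_state is provided by the caller.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}
```

The follow-on python3-libnmstate task queued next in the trace is gated by the same `network_state != {}` condition, so it skips for the same reason in this run.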
30575 1726867624.49401: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867624.49571: in run() - task 0affcac9-a3a5-e081-a588-000000001284 30575 1726867624.49594: variable 'ansible_search_path' from source: unknown 30575 1726867624.49604: variable 'ansible_search_path' from source: unknown 30575 1726867624.49652: calling self._execute() 30575 1726867624.49751: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867624.49763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867624.49776: variable 'omit' from source: magic vars 30575 1726867624.50519: variable 'ansible_distribution_major_version' from source: facts 30575 1726867624.50536: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867624.50767: variable 'network_state' from source: role '' defaults 30575 1726867624.50835: Evaluated conditional (network_state != {}): False 30575 1726867624.50843: when evaluation is False, skipping this task 30575 1726867624.50850: _execute() done 30575 1726867624.50984: dumping result to json 30575 1726867624.50988: done dumping result, returning 30575 1726867624.50991: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000001284] 30575 1726867624.50993: sending task result for task 0affcac9-a3a5-e081-a588-000000001284 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867624.51124: no more pending results, returning what we have 30575 1726867624.51129: results queue empty 30575 1726867624.51129: checking for any_errors_fatal 30575 1726867624.51138: done checking for any_errors_fatal 30575 1726867624.51139: checking for max_fail_percentage 30575 
1726867624.51141: done checking for max_fail_percentage 30575 1726867624.51142: checking to see if all hosts have failed and the running result is not ok 30575 1726867624.51143: done checking to see if all hosts have failed 30575 1726867624.51143: getting the remaining hosts for this loop 30575 1726867624.51145: done getting the remaining hosts for this loop 30575 1726867624.51149: getting the next task for host managed_node3 30575 1726867624.51158: done getting next task for host managed_node3 30575 1726867624.51163: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867624.51169: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867624.51199: getting variables 30575 1726867624.51202: in VariableManager get_vars() 30575 1726867624.51242: Calling all_inventory to load vars for managed_node3 30575 1726867624.51245: Calling groups_inventory to load vars for managed_node3 30575 1726867624.51248: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867624.51261: Calling all_plugins_play to load vars for managed_node3 30575 1726867624.51264: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867624.51267: Calling groups_plugins_play to load vars for managed_node3 30575 1726867624.52092: done sending task result for task 0affcac9-a3a5-e081-a588-000000001284 30575 1726867624.52096: WORKER PROCESS EXITING 30575 1726867624.54252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867624.56958: done with get_vars() 30575 1726867624.56988: done getting variables 30575 1726867624.57051: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:27:04 -0400 (0:00:00.083) 0:00:59.948 ****** 30575 1726867624.57092: entering _queue_task() for managed_node3/service 30575 1726867624.57456: worker is 1 (out of 1 available) 30575 1726867624.57469: exiting _queue_task() for managed_node3/service 30575 1726867624.57587: done queuing things up, now waiting for results queue to drain 30575 1726867624.57589: waiting for pending results... 
30575 1726867624.57789: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867624.57945: in run() - task 0affcac9-a3a5-e081-a588-000000001285 30575 1726867624.57963: variable 'ansible_search_path' from source: unknown 30575 1726867624.57974: variable 'ansible_search_path' from source: unknown 30575 1726867624.58019: calling self._execute() 30575 1726867624.58192: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867624.58206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867624.58223: variable 'omit' from source: magic vars 30575 1726867624.58806: variable 'ansible_distribution_major_version' from source: facts 30575 1726867624.58824: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867624.59040: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867624.59233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867624.62612: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867624.62631: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867624.62721: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867624.62763: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867624.62828: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867624.62916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30575 1726867624.63050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867624.63052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.63054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867624.63056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867624.63189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867624.63218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867624.63248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.63298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867624.63314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867624.63352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867624.63383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867624.63412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.63454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867624.63471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867624.63774: variable 'network_connections' from source: include params 30575 1726867624.63794: variable 'interface' from source: play vars 30575 1726867624.64131: variable 'interface' from source: play vars 30575 1726867624.64171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867624.64547: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867624.64722: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867624.64808: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867624.64811: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867624.64844: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867624.64921: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867624.64957: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.64990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867624.65463: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867624.65530: variable 'network_connections' from source: include params 30575 1726867624.65540: variable 'interface' from source: play vars 30575 1726867624.65615: variable 'interface' from source: play vars 30575 1726867624.65647: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867624.65675: when evaluation is False, skipping this task 30575 1726867624.65692: _execute() done 30575 1726867624.65700: dumping result to json 30575 1726867624.65707: done dumping result, returning 30575 1726867624.65721: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001285] 30575 1726867624.65731: sending task result for task 0affcac9-a3a5-e081-a588-000000001285 30575 1726867624.65983: done sending task result for task 
0affcac9-a3a5-e081-a588-000000001285 30575 1726867624.65994: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867624.66041: no more pending results, returning what we have 30575 1726867624.66045: results queue empty 30575 1726867624.66046: checking for any_errors_fatal 30575 1726867624.66054: done checking for any_errors_fatal 30575 1726867624.66055: checking for max_fail_percentage 30575 1726867624.66057: done checking for max_fail_percentage 30575 1726867624.66058: checking to see if all hosts have failed and the running result is not ok 30575 1726867624.66060: done checking to see if all hosts have failed 30575 1726867624.66060: getting the remaining hosts for this loop 30575 1726867624.66062: done getting the remaining hosts for this loop 30575 1726867624.66066: getting the next task for host managed_node3 30575 1726867624.66076: done getting next task for host managed_node3 30575 1726867624.66083: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867624.66088: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867624.66113: getting variables 30575 1726867624.66114: in VariableManager get_vars() 30575 1726867624.66156: Calling all_inventory to load vars for managed_node3 30575 1726867624.66159: Calling groups_inventory to load vars for managed_node3 30575 1726867624.66161: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867624.66172: Calling all_plugins_play to load vars for managed_node3 30575 1726867624.66175: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867624.66180: Calling groups_plugins_play to load vars for managed_node3 30575 1726867624.67815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867624.69789: done with get_vars() 30575 1726867624.69817: done getting variables 30575 1726867624.69892: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:27:04 -0400 (0:00:00.128) 0:01:00.076 ****** 30575 1726867624.69934: entering _queue_task() for managed_node3/service 30575 1726867624.70361: worker is 1 (out of 1 available) 30575 1726867624.70375: exiting _queue_task() for managed_node3/service 30575 1726867624.70593: done 
queuing things up, now waiting for results queue to drain 30575 1726867624.70595: waiting for pending results... 30575 1726867624.70937: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867624.71058: in run() - task 0affcac9-a3a5-e081-a588-000000001286 30575 1726867624.71062: variable 'ansible_search_path' from source: unknown 30575 1726867624.71065: variable 'ansible_search_path' from source: unknown 30575 1726867624.71111: calling self._execute() 30575 1726867624.71236: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867624.71240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867624.71251: variable 'omit' from source: magic vars 30575 1726867624.71710: variable 'ansible_distribution_major_version' from source: facts 30575 1726867624.71725: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867624.71880: variable 'network_provider' from source: set_fact 30575 1726867624.71886: variable 'network_state' from source: role '' defaults 30575 1726867624.71895: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30575 1726867624.71901: variable 'omit' from source: magic vars 30575 1726867624.71955: variable 'omit' from source: magic vars 30575 1726867624.71975: variable 'network_service_name' from source: role '' defaults 30575 1726867624.72028: variable 'network_service_name' from source: role '' defaults 30575 1726867624.72103: variable '__network_provider_setup' from source: role '' defaults 30575 1726867624.72107: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867624.72157: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867624.72165: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867624.72210: variable '__network_packages_default_nm' from source: role '' 
defaults 30575 1726867624.72364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867624.74083: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867624.74097: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867624.74141: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867624.74181: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867624.74216: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867624.74297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867624.74336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867624.74368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.74418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867624.74440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867624.74492: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867624.74652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867624.74660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.74663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867624.74666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867624.74805: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867624.74893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867624.74909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867624.74929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.74953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867624.74970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867624.75029: variable 'ansible_python' from source: facts 30575 1726867624.75041: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867624.75101: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867624.75156: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867624.75244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867624.75261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867624.75279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.75308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867624.75322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867624.75353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867624.75372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867624.75390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.75422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867624.75432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867624.75524: variable 'network_connections' from source: include params 30575 1726867624.75532: variable 'interface' from source: play vars 30575 1726867624.75587: variable 'interface' from source: play vars 30575 1726867624.75662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867624.75794: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867624.75830: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867624.75865: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867624.75896: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867624.75940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867624.75965: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867624.75989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867624.76011: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867624.76050: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867624.76232: variable 'network_connections' from source: include params 30575 1726867624.76238: variable 'interface' from source: play vars 30575 1726867624.76293: variable 'interface' from source: play vars 30575 1726867624.76316: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867624.76369: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867624.76560: variable 'network_connections' from source: include params 30575 1726867624.76563: variable 'interface' from source: play vars 30575 1726867624.76630: variable 'interface' from source: play vars 30575 1726867624.76646: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867624.76755: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867624.77083: variable 'network_connections' from source: include params 30575 1726867624.77088: variable 'interface' from source: play vars 30575 1726867624.77090: variable 'interface' from source: play vars 30575 1726867624.77115: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30575 1726867624.77174: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867624.77181: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867624.77242: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867624.77450: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867624.77937: variable 'network_connections' from source: include params 30575 1726867624.77940: variable 'interface' from source: play vars 30575 1726867624.77999: variable 'interface' from source: play vars 30575 1726867624.78012: variable 'ansible_distribution' from source: facts 30575 1726867624.78015: variable '__network_rh_distros' from source: role '' defaults 30575 1726867624.78018: variable 'ansible_distribution_major_version' from source: facts 30575 1726867624.78132: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867624.78237: variable 'ansible_distribution' from source: facts 30575 1726867624.78246: variable '__network_rh_distros' from source: role '' defaults 30575 1726867624.78249: variable 'ansible_distribution_major_version' from source: facts 30575 1726867624.78251: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867624.78368: variable 'ansible_distribution' from source: facts 30575 1726867624.78371: variable '__network_rh_distros' from source: role '' defaults 30575 1726867624.78374: variable 'ansible_distribution_major_version' from source: facts 30575 1726867624.78391: variable 'network_provider' from source: set_fact 30575 1726867624.78408: variable 'omit' from source: magic vars 30575 1726867624.78431: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867624.78454: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867624.78471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867624.78484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867624.78492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867624.78515: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867624.78521: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867624.78523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867624.78595: Set connection var ansible_pipelining to False 30575 1726867624.78598: Set connection var ansible_shell_type to sh 30575 1726867624.78603: Set connection var ansible_shell_executable to /bin/sh 30575 1726867624.78608: Set connection var ansible_timeout to 10 30575 1726867624.78613: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867624.78621: Set connection var ansible_connection to ssh 30575 1726867624.78641: variable 'ansible_shell_executable' from source: unknown 30575 1726867624.78644: variable 'ansible_connection' from source: unknown 30575 1726867624.78646: variable 'ansible_module_compression' from source: unknown 30575 1726867624.78648: variable 'ansible_shell_type' from source: unknown 30575 1726867624.78651: variable 'ansible_shell_executable' from source: unknown 30575 1726867624.78653: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867624.78657: variable 'ansible_pipelining' from source: unknown 30575 1726867624.78660: variable 'ansible_timeout' from source: unknown 30575 1726867624.78670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867624.78740: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867624.78748: variable 'omit' from source: magic vars 30575 1726867624.78754: starting attempt loop 30575 1726867624.78756: running the handler 30575 1726867624.78815: variable 'ansible_facts' from source: unknown 30575 1726867624.79293: _low_level_execute_command(): starting 30575 1726867624.79298: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867624.79846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867624.79850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867624.79900: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867624.79930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 
1726867624.80006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867624.81672: stdout chunk (state=3): >>>/root <<< 30575 1726867624.81767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867624.81797: stderr chunk (state=3): >>><<< 30575 1726867624.81800: stdout chunk (state=3): >>><<< 30575 1726867624.81821: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867624.81832: _low_level_execute_command(): starting 30575 1726867624.81838: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867624.8182123-33457-269828461669704 `" && echo ansible-tmp-1726867624.8182123-33457-269828461669704="` echo 
/root/.ansible/tmp/ansible-tmp-1726867624.8182123-33457-269828461669704 `" ) && sleep 0' 30575 1726867624.82247: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867624.82281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867624.82285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867624.82287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867624.82289: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867624.82292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867624.82293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867624.82349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867624.82352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867624.82398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867624.84302: stdout chunk (state=3): >>>ansible-tmp-1726867624.8182123-33457-269828461669704=/root/.ansible/tmp/ansible-tmp-1726867624.8182123-33457-269828461669704 <<< 30575 1726867624.84413: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 30575 1726867624.84435: stderr chunk (state=3): >>><<< 30575 1726867624.84438: stdout chunk (state=3): >>><<< 30575 1726867624.84453: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867624.8182123-33457-269828461669704=/root/.ansible/tmp/ansible-tmp-1726867624.8182123-33457-269828461669704 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867624.84481: variable 'ansible_module_compression' from source: unknown 30575 1726867624.84524: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30575 1726867624.84578: variable 'ansible_facts' from source: unknown 30575 1726867624.84717: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867624.8182123-33457-269828461669704/AnsiballZ_systemd.py 30575 
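The `umask 77 && mkdir -p ... && mkdir ansible-tmp-<timestamp>-<pid>-<random>` command above is how the play stages a private, collision-free working directory on the remote host before transferring AnsiballZ_systemd.py. A minimal Python sketch of the same pattern (using a local path, not Ansible's `/root/.ansible/tmp`):

```python
import os
import time

def make_remote_style_tmpdir(base="/tmp/ansible-demo"):
    """Mimic the tmp-dir pattern seen in the log: an owner-only (0700) base
    directory plus a timestamped subdirectory whose name embeds the current
    time and PID so concurrent runs do not collide."""
    os.makedirs(base, mode=0o700, exist_ok=True)
    name = f"ansible-tmp-{time.time()}-{os.getpid()}"
    path = os.path.join(base, name)
    os.mkdir(path, 0o700)  # 0o700 plays the role of `umask 77 && mkdir`
    return path
```

The 0700 mode matters because module payloads staged there can contain sensitive task parameters.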
1726867624.84808: Sending initial data 30575 1726867624.84811: Sent initial data (156 bytes) 30575 1726867624.85240: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867624.85244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867624.85250: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867624.85252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867624.85254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867624.85298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867624.85301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867624.85352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867624.86894: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" 
revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30575 1726867624.86901: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867624.86936: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867624.86981: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp5u749u8z /root/.ansible/tmp/ansible-tmp-1726867624.8182123-33457-269828461669704/AnsiballZ_systemd.py <<< 30575 1726867624.86985: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867624.8182123-33457-269828461669704/AnsiballZ_systemd.py" <<< 30575 1726867624.87024: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp5u749u8z" to remote "/root/.ansible/tmp/ansible-tmp-1726867624.8182123-33457-269828461669704/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867624.8182123-33457-269828461669704/AnsiballZ_systemd.py" <<< 30575 1726867624.88107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867624.88139: stderr chunk (state=3): >>><<< 30575 1726867624.88142: stdout chunk (state=3): >>><<< 30575 1726867624.88175: done transferring module to remote 30575 1726867624.88186: _low_level_execute_command(): starting 30575 1726867624.88192: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867624.8182123-33457-269828461669704/ 
/root/.ansible/tmp/ansible-tmp-1726867624.8182123-33457-269828461669704/AnsiballZ_systemd.py && sleep 0' 30575 1726867624.88607: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867624.88611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867624.88615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867624.88618: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867624.88621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867624.88667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867624.88670: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867624.88719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867624.90488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867624.90510: stderr chunk (state=3): >>><<< 30575 1726867624.90514: stdout chunk (state=3): >>><<< 30575 1726867624.90526: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867624.90529: _low_level_execute_command(): starting 30575 1726867624.90533: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867624.8182123-33457-269828461669704/AnsiballZ_systemd.py && sleep 0' 30575 1726867624.90935: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867624.90939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867624.90942: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867624.90944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867624.90990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867624.90998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867624.91042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867625.20392: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", 
"ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10563584", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3318304768", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1866228000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": 
"[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", 
"CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", 
"InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30575 1726867625.22197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
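The stdout chunk above is the JSON result the `systemd` module printed on the remote host before the shared connection closed. A small sketch of consuming such a result — the `raw` string below is a hand-abbreviated subset of the payload shown in the log, not the full property dump:

```python
import json

# Abbreviated, hand-copied subset of the systemd module result from the log;
# the real payload carries the full systemd unit property dump.
raw = '''{"name": "NetworkManager", "changed": false,
          "status": {"ActiveState": "active", "SubState": "running",
                     "UnitFileState": "enabled", "MainPID": "702"},
          "enabled": true, "state": "started"}'''

result = json.loads(raw)
status = result["status"]
print(result["name"], status["ActiveState"], status["SubState"])
# → NetworkManager active running
```

`"changed": false` is what makes this task report *ok* rather than *changed*: the unit was already started and enabled, so the module had nothing to do.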
<<< 30575 1726867625.22249: stderr chunk (state=3): >>><<< 30575 1726867625.22294: stdout chunk (state=3): >>><<< 30575 1726867625.22603: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10563584", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3318304768", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1866228000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867625.23093: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867624.8182123-33457-269828461669704/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867625.23131: _low_level_execute_command(): starting 30575 1726867625.23134: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867624.8182123-33457-269828461669704/ > /dev/null 2>&1 && sleep 0' 30575 1726867625.23951: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867625.23970: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867625.24067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867625.24124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867625.24152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867625.24170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867625.24202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867625.24375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867625.26254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867625.26258: stdout chunk (state=3): >>><<< 30575 1726867625.26260: stderr chunk (state=3): >>><<< 30575 1726867625.26483: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867625.26487: handler run complete 30575 1726867625.26489: attempt loop complete, returning result 30575 1726867625.26491: _execute() done 30575 1726867625.26493: dumping result to json 30575 1726867625.26495: done dumping result, returning 30575 1726867625.26497: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-e081-a588-000000001286] 30575 1726867625.26499: sending task result for task 0affcac9-a3a5-e081-a588-000000001286 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867625.26901: no more pending results, returning what we have 30575 1726867625.26905: results queue empty 30575 1726867625.26906: checking for any_errors_fatal 30575 1726867625.26912: done checking for any_errors_fatal 30575 1726867625.26913: checking for max_fail_percentage 30575 1726867625.26915: done checking for max_fail_percentage 30575 1726867625.26916: checking to see if all hosts have failed and the running result is not ok 30575 1726867625.26917: done checking to see if all hosts have failed 30575 1726867625.26918: getting the remaining hosts for this loop 30575 1726867625.26920: done getting the remaining hosts for this loop 30575 1726867625.26924: getting the next task for host managed_node3 30575 1726867625.26935: done getting next task for host managed_node3 30575 1726867625.26939: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867625.26945: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867625.26959: getting variables 30575 1726867625.26961: in VariableManager get_vars() 30575 1726867625.27004: Calling all_inventory to load vars for managed_node3 30575 1726867625.27008: Calling groups_inventory to load vars for managed_node3 30575 1726867625.27010: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867625.27021: Calling all_plugins_play to load vars for managed_node3 30575 1726867625.27024: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867625.27027: Calling groups_plugins_play to load vars for managed_node3 30575 1726867625.27943: done sending task result for task 0affcac9-a3a5-e081-a588-000000001286 30575 1726867625.27947: WORKER PROCESS EXITING 30575 1726867625.30404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867625.32617: done with get_vars() 30575 1726867625.32650: done getting variables 30575 1726867625.32708: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:27:05 -0400 (0:00:00.628) 0:01:00.704 ****** 30575 1726867625.32750: entering _queue_task() for managed_node3/service 30575 1726867625.33216: worker is 1 (out of 1 available) 30575 1726867625.33228: exiting _queue_task() for managed_node3/service 30575 1726867625.33242: done queuing things up, now waiting for results queue to drain 30575 1726867625.33243: waiting for pending results... 30575 1726867625.33450: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867625.33617: in run() - task 0affcac9-a3a5-e081-a588-000000001287 30575 1726867625.33638: variable 'ansible_search_path' from source: unknown 30575 1726867625.33646: variable 'ansible_search_path' from source: unknown 30575 1726867625.33699: calling self._execute() 30575 1726867625.33804: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867625.33815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867625.33830: variable 'omit' from source: magic vars 30575 1726867625.34272: variable 'ansible_distribution_major_version' from source: facts 30575 1726867625.34292: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867625.34417: variable 'network_provider' from source: set_fact 30575 1726867625.34429: Evaluated conditional (network_provider == "nm"): True 30575 1726867625.34530: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 
1726867625.34634: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867625.34820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867625.37382: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867625.37455: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867625.37506: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867625.37582: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867625.37586: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867625.37672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867625.37718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867625.37751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867625.37801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867625.37983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30575 1726867625.37987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867625.37989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867625.37992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867625.37994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867625.37997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867625.38044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867625.38073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867625.38110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867625.38155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867625.38174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867625.38333: variable 'network_connections' from source: include params 30575 1726867625.38442: variable 'interface' from source: play vars 30575 1726867625.38446: variable 'interface' from source: play vars 30575 1726867625.38499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867625.38697: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867625.38738: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867625.38783: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867625.38815: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867625.38862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867625.38898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867625.38928: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867625.38959: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867625.39019: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867625.39269: variable 'network_connections' from source: include params 30575 1726867625.39281: variable 'interface' from source: play vars 30575 1726867625.39349: variable 'interface' from source: play vars 30575 1726867625.39382: Evaluated conditional (__network_wpa_supplicant_required): False 30575 1726867625.39391: when evaluation is False, skipping this task 30575 1726867625.39399: _execute() done 30575 1726867625.39421: dumping result to json 30575 1726867625.39424: done dumping result, returning 30575 1726867625.39431: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-e081-a588-000000001287] 30575 1726867625.39481: sending task result for task 0affcac9-a3a5-e081-a588-000000001287 30575 1726867625.39846: done sending task result for task 0affcac9-a3a5-e081-a588-000000001287 30575 1726867625.39850: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30575 1726867625.39894: no more pending results, returning what we have 30575 1726867625.39897: results queue empty 30575 1726867625.39898: checking for any_errors_fatal 30575 1726867625.39913: done checking for any_errors_fatal 30575 1726867625.39914: checking for max_fail_percentage 30575 1726867625.39916: done checking for max_fail_percentage 30575 1726867625.39916: checking to see if all hosts have failed and the running result is not ok 30575 1726867625.39917: done checking to see if all hosts have failed 30575 1726867625.39917: getting the remaining hosts for this loop 30575 1726867625.39918: done getting the remaining hosts for this loop 30575 1726867625.39921: getting the next task 
for host managed_node3 30575 1726867625.39926: done getting next task for host managed_node3 30575 1726867625.39930: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867625.39935: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867625.39953: getting variables 30575 1726867625.39954: in VariableManager get_vars() 30575 1726867625.39984: Calling all_inventory to load vars for managed_node3 30575 1726867625.39988: Calling groups_inventory to load vars for managed_node3 30575 1726867625.39990: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867625.39996: Calling all_plugins_play to load vars for managed_node3 30575 1726867625.39998: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867625.40000: Calling groups_plugins_play to load vars for managed_node3 30575 1726867625.40875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867625.41884: done with get_vars() 30575 1726867625.41906: done getting variables 30575 1726867625.41964: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:27:05 -0400 (0:00:00.092) 0:01:00.797 ****** 30575 1726867625.41999: entering _queue_task() for managed_node3/service 30575 1726867625.42299: worker is 1 (out of 1 available) 30575 1726867625.42311: exiting _queue_task() for managed_node3/service 30575 1726867625.42324: done queuing things up, now waiting for results queue to drain 30575 1726867625.42326: waiting for pending results... 
30575 1726867625.42664: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867625.42735: in run() - task 0affcac9-a3a5-e081-a588-000000001288 30575 1726867625.42747: variable 'ansible_search_path' from source: unknown 30575 1726867625.42751: variable 'ansible_search_path' from source: unknown 30575 1726867625.42794: calling self._execute() 30575 1726867625.42872: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867625.42876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867625.42888: variable 'omit' from source: magic vars 30575 1726867625.43166: variable 'ansible_distribution_major_version' from source: facts 30575 1726867625.43182: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867625.43259: variable 'network_provider' from source: set_fact 30575 1726867625.43265: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867625.43270: when evaluation is False, skipping this task 30575 1726867625.43272: _execute() done 30575 1726867625.43275: dumping result to json 30575 1726867625.43282: done dumping result, returning 30575 1726867625.43291: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-e081-a588-000000001288] 30575 1726867625.43294: sending task result for task 0affcac9-a3a5-e081-a588-000000001288 30575 1726867625.43409: done sending task result for task 0affcac9-a3a5-e081-a588-000000001288 30575 1726867625.43412: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867625.43487: no more pending results, returning what we have 30575 1726867625.43490: results queue empty 30575 1726867625.43490: checking for any_errors_fatal 30575 1726867625.43497: done checking for 
any_errors_fatal 30575 1726867625.43497: checking for max_fail_percentage 30575 1726867625.43499: done checking for max_fail_percentage 30575 1726867625.43500: checking to see if all hosts have failed and the running result is not ok 30575 1726867625.43501: done checking to see if all hosts have failed 30575 1726867625.43501: getting the remaining hosts for this loop 30575 1726867625.43502: done getting the remaining hosts for this loop 30575 1726867625.43505: getting the next task for host managed_node3 30575 1726867625.43512: done getting next task for host managed_node3 30575 1726867625.43517: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867625.43528: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867625.43547: getting variables 30575 1726867625.43549: in VariableManager get_vars() 30575 1726867625.43573: Calling all_inventory to load vars for managed_node3 30575 1726867625.43574: Calling groups_inventory to load vars for managed_node3 30575 1726867625.43576: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867625.43584: Calling all_plugins_play to load vars for managed_node3 30575 1726867625.43586: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867625.43587: Calling groups_plugins_play to load vars for managed_node3 30575 1726867625.44315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867625.45639: done with get_vars() 30575 1726867625.45659: done getting variables 30575 1726867625.45718: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:27:05 -0400 (0:00:00.037) 0:01:00.835 ****** 30575 1726867625.45751: entering _queue_task() for managed_node3/copy 30575 1726867625.46209: worker is 1 (out of 1 available) 30575 1726867625.46222: exiting _queue_task() for managed_node3/copy 30575 1726867625.46239: done queuing things up, now waiting for results queue to drain 30575 1726867625.46241: waiting for pending results... 
30575 1726867625.46480: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867625.46544: in run() - task 0affcac9-a3a5-e081-a588-000000001289 30575 1726867625.46563: variable 'ansible_search_path' from source: unknown 30575 1726867625.46575: variable 'ansible_search_path' from source: unknown 30575 1726867625.46622: calling self._execute() 30575 1726867625.46729: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867625.46741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867625.46756: variable 'omit' from source: magic vars 30575 1726867625.47152: variable 'ansible_distribution_major_version' from source: facts 30575 1726867625.47282: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867625.47290: variable 'network_provider' from source: set_fact 30575 1726867625.47303: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867625.47309: when evaluation is False, skipping this task 30575 1726867625.47319: _execute() done 30575 1726867625.47327: dumping result to json 30575 1726867625.47334: done dumping result, returning 30575 1726867625.47346: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-e081-a588-000000001289] 30575 1726867625.47356: sending task result for task 0affcac9-a3a5-e081-a588-000000001289 30575 1726867625.47583: done sending task result for task 0affcac9-a3a5-e081-a588-000000001289 30575 1726867625.47586: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30575 1726867625.47640: no more pending results, returning what we have 30575 1726867625.47644: results queue empty 30575 1726867625.47645: checking for 
any_errors_fatal 30575 1726867625.47653: done checking for any_errors_fatal 30575 1726867625.47654: checking for max_fail_percentage 30575 1726867625.47656: done checking for max_fail_percentage 30575 1726867625.47657: checking to see if all hosts have failed and the running result is not ok 30575 1726867625.47659: done checking to see if all hosts have failed 30575 1726867625.47659: getting the remaining hosts for this loop 30575 1726867625.47661: done getting the remaining hosts for this loop 30575 1726867625.47665: getting the next task for host managed_node3 30575 1726867625.47675: done getting next task for host managed_node3 30575 1726867625.47681: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867625.47687: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867625.47718: getting variables 30575 1726867625.47720: in VariableManager get_vars() 30575 1726867625.47760: Calling all_inventory to load vars for managed_node3 30575 1726867625.47762: Calling groups_inventory to load vars for managed_node3 30575 1726867625.47765: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867625.48062: Calling all_plugins_play to load vars for managed_node3 30575 1726867625.48067: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867625.48071: Calling groups_plugins_play to load vars for managed_node3 30575 1726867625.50169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867625.51785: done with get_vars() 30575 1726867625.51806: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:27:05 -0400 (0:00:00.061) 0:01:00.896 ****** 30575 1726867625.51891: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867625.52179: worker is 1 (out of 1 available) 30575 1726867625.52190: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867625.52201: done queuing things up, now waiting for results queue to drain 30575 1726867625.52203: waiting for pending results... 
30575 1726867625.52598: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867625.52631: in run() - task 0affcac9-a3a5-e081-a588-00000000128a 30575 1726867625.52650: variable 'ansible_search_path' from source: unknown 30575 1726867625.52656: variable 'ansible_search_path' from source: unknown 30575 1726867625.52702: calling self._execute() 30575 1726867625.52799: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867625.52815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867625.52829: variable 'omit' from source: magic vars 30575 1726867625.53208: variable 'ansible_distribution_major_version' from source: facts 30575 1726867625.53227: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867625.53240: variable 'omit' from source: magic vars 30575 1726867625.53315: variable 'omit' from source: magic vars 30575 1726867625.53476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867625.55672: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867625.55750: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867625.55847: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867625.55850: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867625.55863: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867625.55952: variable 'network_provider' from source: set_fact 30575 1726867625.56100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867625.56136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867625.56165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867625.56217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867625.56237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867625.56320: variable 'omit' from source: magic vars 30575 1726867625.56482: variable 'omit' from source: magic vars 30575 1726867625.56546: variable 'network_connections' from source: include params 30575 1726867625.56562: variable 'interface' from source: play vars 30575 1726867625.56633: variable 'interface' from source: play vars 30575 1726867625.56787: variable 'omit' from source: magic vars 30575 1726867625.56799: variable '__lsr_ansible_managed' from source: task vars 30575 1726867625.56869: variable '__lsr_ansible_managed' from source: task vars 30575 1726867625.57072: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30575 1726867625.57373: Loaded config def from plugin (lookup/template) 30575 1726867625.57378: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30575 1726867625.57381: File lookup term: get_ansible_managed.j2 30575 1726867625.57383: variable 
'ansible_search_path' from source: unknown 30575 1726867625.57386: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30575 1726867625.57390: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30575 1726867625.57393: variable 'ansible_search_path' from source: unknown 30575 1726867625.66333: variable 'ansible_managed' from source: unknown 30575 1726867625.66587: variable 'omit' from source: magic vars 30575 1726867625.66630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867625.66885: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867625.66889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867625.66891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30575 1726867625.66893: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867625.66899: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867625.66908: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867625.66993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867625.67322: Set connection var ansible_pipelining to False 30575 1726867625.67325: Set connection var ansible_shell_type to sh 30575 1726867625.67328: Set connection var ansible_shell_executable to /bin/sh 30575 1726867625.67330: Set connection var ansible_timeout to 10 30575 1726867625.67332: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867625.67334: Set connection var ansible_connection to ssh 30575 1726867625.67336: variable 'ansible_shell_executable' from source: unknown 30575 1726867625.67338: variable 'ansible_connection' from source: unknown 30575 1726867625.67340: variable 'ansible_module_compression' from source: unknown 30575 1726867625.67342: variable 'ansible_shell_type' from source: unknown 30575 1726867625.67344: variable 'ansible_shell_executable' from source: unknown 30575 1726867625.67346: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867625.67348: variable 'ansible_pipelining' from source: unknown 30575 1726867625.67350: variable 'ansible_timeout' from source: unknown 30575 1726867625.67352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867625.67663: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867625.67689: variable 'omit' from 
source: magic vars 30575 1726867625.67700: starting attempt loop 30575 1726867625.67709: running the handler 30575 1726867625.67732: _low_level_execute_command(): starting 30575 1726867625.67767: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867625.69208: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867625.69221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867625.69238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867625.69254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867625.69270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867625.69394: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867625.69440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867625.69593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867625.69633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867625.71384: stdout chunk (state=3): >>>/root <<< 30575 1726867625.71580: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 30575 1726867625.71584: stdout chunk (state=3): >>><<< 30575 1726867625.71593: stderr chunk (state=3): >>><<< 30575 1726867625.71611: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867625.71688: _low_level_execute_command(): starting 30575 1726867625.71695: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867625.7161129-33500-140647799453294 `" && echo ansible-tmp-1726867625.7161129-33500-140647799453294="` echo /root/.ansible/tmp/ansible-tmp-1726867625.7161129-33500-140647799453294 `" ) && sleep 0' 30575 1726867625.72682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867625.72699: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867625.72731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867625.72839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867625.72918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867625.73092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867625.73167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867625.75103: stdout chunk (state=3): >>>ansible-tmp-1726867625.7161129-33500-140647799453294=/root/.ansible/tmp/ansible-tmp-1726867625.7161129-33500-140647799453294 <<< 30575 1726867625.75262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867625.75265: stdout chunk (state=3): >>><<< 30575 1726867625.75268: stderr chunk (state=3): >>><<< 30575 1726867625.75271: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867625.7161129-33500-140647799453294=/root/.ansible/tmp/ansible-tmp-1726867625.7161129-33500-140647799453294 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867625.75369: variable 'ansible_module_compression' from source: unknown 30575 1726867625.75388: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30575 1726867625.75425: variable 'ansible_facts' from source: unknown 30575 1726867625.75550: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867625.7161129-33500-140647799453294/AnsiballZ_network_connections.py 30575 1726867625.75719: Sending initial data 30575 1726867625.75722: Sent initial data (168 bytes) 30575 1726867625.76390: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867625.76415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867625.76433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867625.76534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867625.78056: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867625.78118: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 30575 1726867625.78186: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmppy8ltyyv /root/.ansible/tmp/ansible-tmp-1726867625.7161129-33500-140647799453294/AnsiballZ_network_connections.py <<< 30575 1726867625.78204: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867625.7161129-33500-140647799453294/AnsiballZ_network_connections.py" <<< 30575 1726867625.78235: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmppy8ltyyv" to remote "/root/.ansible/tmp/ansible-tmp-1726867625.7161129-33500-140647799453294/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867625.7161129-33500-140647799453294/AnsiballZ_network_connections.py" <<< 30575 1726867625.79283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867625.79287: stdout chunk (state=3): >>><<< 30575 1726867625.79289: stderr chunk (state=3): >>><<< 30575 1726867625.79398: done transferring module to remote 30575 1726867625.79401: _low_level_execute_command(): starting 30575 1726867625.79404: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867625.7161129-33500-140647799453294/ /root/.ansible/tmp/ansible-tmp-1726867625.7161129-33500-140647799453294/AnsiballZ_network_connections.py && sleep 0' 30575 1726867625.79936: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867625.79950: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867625.79964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867625.80016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867625.80097: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867625.80113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867625.80129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867625.80249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867625.82058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867625.82062: stdout chunk (state=3): >>><<< 30575 1726867625.82067: stderr chunk (state=3): >>><<< 30575 1726867625.82089: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867625.82101: _low_level_execute_command(): starting 30575 1726867625.82111: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867625.7161129-33500-140647799453294/AnsiballZ_network_connections.py && sleep 0' 30575 1726867625.82704: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867625.82726: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867625.82729: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867625.82791: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867625.82829: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867625.82849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867625.82863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867625.82940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867626.08040: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 12e4c575-fa21-4cd0-afc7-2cb6b45b6219 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30575 1726867626.09885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867626.09889: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867626.09891: stdout chunk (state=3): >>><<< 30575 1726867626.09893: stderr chunk (state=3): >>><<< 30575 1726867626.09896: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 12e4c575-fa21-4cd0-afc7-2cb6b45b6219 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867626.09916: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867625.7161129-33500-140647799453294/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867626.09929: _low_level_execute_command(): starting 30575 1726867626.09934: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867625.7161129-33500-140647799453294/ > /dev/null 2>&1 && sleep 0' 30575 1726867626.10591: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867626.10682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867626.10686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867626.10688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867626.10690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867626.10692: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867626.10694: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867626.10696: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867626.10698: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867626.10755: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867626.10766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867626.10779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867626.10802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867626.10879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867626.12782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867626.12785: stderr chunk (state=3): >>><<< 30575 1726867626.12787: stdout chunk (state=3): >>><<< 30575 1726867626.12799: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867626.12805: handler run complete 30575 1726867626.12837: attempt loop complete, returning result 30575 1726867626.12840: _execute() done 30575 1726867626.12843: dumping result to json 30575 1726867626.12847: done dumping result, returning 30575 1726867626.12858: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-e081-a588-00000000128a] 30575 1726867626.12863: sending task result for task 0affcac9-a3a5-e081-a588-00000000128a 30575 1726867626.12979: done sending task result for task 0affcac9-a3a5-e081-a588-00000000128a 30575 1726867626.12982: WORKER PROCESS EXITING ok: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 12e4c575-fa21-4cd0-afc7-2cb6b45b6219 skipped because already active 30575 1726867626.13218: no more pending results, returning what we have 30575 1726867626.13222: results queue empty 30575 1726867626.13223: checking for any_errors_fatal 30575 1726867626.13294: done checking for any_errors_fatal 
30575 1726867626.13295: checking for max_fail_percentage 30575 1726867626.13376: done checking for max_fail_percentage 30575 1726867626.13379: checking to see if all hosts have failed and the running result is not ok 30575 1726867626.13381: done checking to see if all hosts have failed 30575 1726867626.13381: getting the remaining hosts for this loop 30575 1726867626.13383: done getting the remaining hosts for this loop 30575 1726867626.13386: getting the next task for host managed_node3 30575 1726867626.13394: done getting next task for host managed_node3 30575 1726867626.13430: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867626.13435: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867626.13449: getting variables 30575 1726867626.13451: in VariableManager get_vars() 30575 1726867626.13487: Calling all_inventory to load vars for managed_node3 30575 1726867626.13489: Calling groups_inventory to load vars for managed_node3 30575 1726867626.13491: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867626.13500: Calling all_plugins_play to load vars for managed_node3 30575 1726867626.13502: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867626.13723: Calling groups_plugins_play to load vars for managed_node3 30575 1726867626.15120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867626.17731: done with get_vars() 30575 1726867626.17756: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:27:06 -0400 (0:00:00.659) 0:01:01.556 ****** 30575 1726867626.17853: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867626.18221: worker is 1 (out of 1 available) 30575 1726867626.18234: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867626.18249: done queuing things up, now waiting for results queue to drain 30575 1726867626.18250: waiting for pending results... 
30575 1726867626.18701: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867626.18718: in run() - task 0affcac9-a3a5-e081-a588-00000000128b 30575 1726867626.18738: variable 'ansible_search_path' from source: unknown 30575 1726867626.18744: variable 'ansible_search_path' from source: unknown 30575 1726867626.18785: calling self._execute() 30575 1726867626.18885: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867626.18898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867626.18917: variable 'omit' from source: magic vars 30575 1726867626.19323: variable 'ansible_distribution_major_version' from source: facts 30575 1726867626.19345: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867626.19483: variable 'network_state' from source: role '' defaults 30575 1726867626.19563: Evaluated conditional (network_state != {}): False 30575 1726867626.19567: when evaluation is False, skipping this task 30575 1726867626.19569: _execute() done 30575 1726867626.19571: dumping result to json 30575 1726867626.19573: done dumping result, returning 30575 1726867626.19575: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-e081-a588-00000000128b] 30575 1726867626.19579: sending task result for task 0affcac9-a3a5-e081-a588-00000000128b skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867626.19837: no more pending results, returning what we have 30575 1726867626.19843: results queue empty 30575 1726867626.19844: checking for any_errors_fatal 30575 1726867626.19860: done checking for any_errors_fatal 30575 1726867626.19861: checking for max_fail_percentage 30575 1726867626.19863: done checking for max_fail_percentage 30575 1726867626.19864: 
checking to see if all hosts have failed and the running result is not ok 30575 1726867626.19865: done checking to see if all hosts have failed 30575 1726867626.19866: getting the remaining hosts for this loop 30575 1726867626.19867: done getting the remaining hosts for this loop 30575 1726867626.19872: getting the next task for host managed_node3 30575 1726867626.19884: done getting next task for host managed_node3 30575 1726867626.19889: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867626.19895: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867626.19929: getting variables 30575 1726867626.19931: in VariableManager get_vars() 30575 1726867626.19973: Calling all_inventory to load vars for managed_node3 30575 1726867626.19976: Calling groups_inventory to load vars for managed_node3 30575 1726867626.20095: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867626.20107: Calling all_plugins_play to load vars for managed_node3 30575 1726867626.20110: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867626.20113: Calling groups_plugins_play to load vars for managed_node3 30575 1726867626.20712: done sending task result for task 0affcac9-a3a5-e081-a588-00000000128b 30575 1726867626.20716: WORKER PROCESS EXITING 30575 1726867626.21604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867626.23200: done with get_vars() 30575 1726867626.23232: done getting variables 30575 1726867626.23301: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:27:06 -0400 (0:00:00.054) 0:01:01.610 ****** 30575 1726867626.23346: entering _queue_task() for managed_node3/debug 30575 1726867626.23890: worker is 1 (out of 1 available) 30575 1726867626.23901: exiting _queue_task() for managed_node3/debug 30575 1726867626.23913: done queuing things up, now waiting for results queue to drain 30575 1726867626.23915: waiting for pending results... 
30575 1726867626.24164: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867626.24337: in run() - task 0affcac9-a3a5-e081-a588-00000000128c 30575 1726867626.24358: variable 'ansible_search_path' from source: unknown 30575 1726867626.24366: variable 'ansible_search_path' from source: unknown 30575 1726867626.24413: calling self._execute() 30575 1726867626.24542: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867626.24555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867626.24572: variable 'omit' from source: magic vars 30575 1726867626.25025: variable 'ansible_distribution_major_version' from source: facts 30575 1726867626.25050: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867626.25074: variable 'omit' from source: magic vars 30575 1726867626.25162: variable 'omit' from source: magic vars 30575 1726867626.25214: variable 'omit' from source: magic vars 30575 1726867626.25382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867626.25433: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867626.25459: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867626.25512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867626.25515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867626.25545: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867626.25554: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867626.25562: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30575 1726867626.25660: Set connection var ansible_pipelining to False 30575 1726867626.25681: Set connection var ansible_shell_type to sh 30575 1726867626.25684: Set connection var ansible_shell_executable to /bin/sh 30575 1726867626.25686: Set connection var ansible_timeout to 10 30575 1726867626.25730: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867626.25733: Set connection var ansible_connection to ssh 30575 1726867626.25739: variable 'ansible_shell_executable' from source: unknown 30575 1726867626.25746: variable 'ansible_connection' from source: unknown 30575 1726867626.25753: variable 'ansible_module_compression' from source: unknown 30575 1726867626.25759: variable 'ansible_shell_type' from source: unknown 30575 1726867626.25765: variable 'ansible_shell_executable' from source: unknown 30575 1726867626.25771: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867626.25839: variable 'ansible_pipelining' from source: unknown 30575 1726867626.25842: variable 'ansible_timeout' from source: unknown 30575 1726867626.25845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867626.25931: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867626.25953: variable 'omit' from source: magic vars 30575 1726867626.25961: starting attempt loop 30575 1726867626.25967: running the handler 30575 1726867626.26102: variable '__network_connections_result' from source: set_fact 30575 1726867626.26164: handler run complete 30575 1726867626.26382: attempt loop complete, returning result 30575 1726867626.26385: _execute() done 30575 1726867626.26388: dumping result to json 30575 1726867626.26390: 
done dumping result, returning 30575 1726867626.26393: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-e081-a588-00000000128c] 30575 1726867626.26395: sending task result for task 0affcac9-a3a5-e081-a588-00000000128c 30575 1726867626.26461: done sending task result for task 0affcac9-a3a5-e081-a588-00000000128c 30575 1726867626.26464: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 12e4c575-fa21-4cd0-afc7-2cb6b45b6219 skipped because already active" ] } 30575 1726867626.26536: no more pending results, returning what we have 30575 1726867626.26541: results queue empty 30575 1726867626.26542: checking for any_errors_fatal 30575 1726867626.26549: done checking for any_errors_fatal 30575 1726867626.26550: checking for max_fail_percentage 30575 1726867626.26551: done checking for max_fail_percentage 30575 1726867626.26552: checking to see if all hosts have failed and the running result is not ok 30575 1726867626.26553: done checking to see if all hosts have failed 30575 1726867626.26554: getting the remaining hosts for this loop 30575 1726867626.26556: done getting the remaining hosts for this loop 30575 1726867626.26560: getting the next task for host managed_node3 30575 1726867626.26568: done getting next task for host managed_node3 30575 1726867626.26572: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867626.26579: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867626.26593: getting variables 30575 1726867626.26595: in VariableManager get_vars() 30575 1726867626.26633: Calling all_inventory to load vars for managed_node3 30575 1726867626.26636: Calling groups_inventory to load vars for managed_node3 30575 1726867626.26638: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867626.26649: Calling all_plugins_play to load vars for managed_node3 30575 1726867626.26652: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867626.26654: Calling groups_plugins_play to load vars for managed_node3 30575 1726867626.28312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867626.29824: done with get_vars() 30575 1726867626.29843: done getting variables 30575 1726867626.29901: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:27:06 -0400 (0:00:00.066) 0:01:01.677 ****** 30575 1726867626.29946: entering _queue_task() for managed_node3/debug 30575 1726867626.30390: worker is 1 (out of 1 available) 30575 1726867626.30401: exiting _queue_task() for managed_node3/debug 30575 1726867626.30412: done queuing things up, now waiting for results queue to drain 30575 1726867626.30414: waiting for pending results... 30575 1726867626.30650: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867626.30729: in run() - task 0affcac9-a3a5-e081-a588-00000000128d 30575 1726867626.30784: variable 'ansible_search_path' from source: unknown 30575 1726867626.30792: variable 'ansible_search_path' from source: unknown 30575 1726867626.30808: calling self._execute() 30575 1726867626.30922: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867626.30964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867626.30967: variable 'omit' from source: magic vars 30575 1726867626.31344: variable 'ansible_distribution_major_version' from source: facts 30575 1726867626.31358: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867626.31367: variable 'omit' from source: magic vars 30575 1726867626.31479: variable 'omit' from source: magic vars 30575 1726867626.31487: variable 'omit' from source: magic vars 30575 1726867626.31538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867626.31590: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867626.31621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867626.31644: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867626.31670: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867626.31711: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867626.31772: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867626.31775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867626.31850: Set connection var ansible_pipelining to False 30575 1726867626.31883: Set connection var ansible_shell_type to sh 30575 1726867626.31886: Set connection var ansible_shell_executable to /bin/sh 30575 1726867626.31888: Set connection var ansible_timeout to 10 30575 1726867626.31890: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867626.31923: Set connection var ansible_connection to ssh 30575 1726867626.31952: variable 'ansible_shell_executable' from source: unknown 30575 1726867626.31961: variable 'ansible_connection' from source: unknown 30575 1726867626.31969: variable 'ansible_module_compression' from source: unknown 30575 1726867626.31994: variable 'ansible_shell_type' from source: unknown 30575 1726867626.32109: variable 'ansible_shell_executable' from source: unknown 30575 1726867626.32112: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867626.32115: variable 'ansible_pipelining' from source: unknown 30575 1726867626.32117: variable 'ansible_timeout' from source: unknown 30575 1726867626.32120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867626.32219: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867626.32241: variable 'omit' from source: magic vars 30575 1726867626.32252: starting attempt loop 30575 1726867626.32259: running the handler 30575 1726867626.32323: variable '__network_connections_result' from source: set_fact 30575 1726867626.32418: variable '__network_connections_result' from source: set_fact 30575 1726867626.32561: handler run complete 30575 1726867626.32598: attempt loop complete, returning result 30575 1726867626.32606: _execute() done 30575 1726867626.32612: dumping result to json 30575 1726867626.32621: done dumping result, returning 30575 1726867626.32668: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-e081-a588-00000000128d] 30575 1726867626.32671: sending task result for task 0affcac9-a3a5-e081-a588-00000000128d 30575 1726867626.32928: done sending task result for task 0affcac9-a3a5-e081-a588-00000000128d 30575 1726867626.32931: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 12e4c575-fa21-4cd0-afc7-2cb6b45b6219 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 12e4c575-fa21-4cd0-afc7-2cb6b45b6219 skipped because already active" ] } } 30575 1726867626.33031: no more pending results, returning what we have 30575 1726867626.33036: results queue empty 30575 
1726867626.33037: checking for any_errors_fatal 30575 1726867626.33044: done checking for any_errors_fatal 30575 1726867626.33044: checking for max_fail_percentage 30575 1726867626.33046: done checking for max_fail_percentage 30575 1726867626.33047: checking to see if all hosts have failed and the running result is not ok 30575 1726867626.33049: done checking to see if all hosts have failed 30575 1726867626.33049: getting the remaining hosts for this loop 30575 1726867626.33051: done getting the remaining hosts for this loop 30575 1726867626.33055: getting the next task for host managed_node3 30575 1726867626.33064: done getting next task for host managed_node3 30575 1726867626.33224: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867626.33241: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867626.33268: getting variables 30575 1726867626.33270: in VariableManager get_vars() 30575 1726867626.33412: Calling all_inventory to load vars for managed_node3 30575 1726867626.33414: Calling groups_inventory to load vars for managed_node3 30575 1726867626.33422: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867626.33431: Calling all_plugins_play to load vars for managed_node3 30575 1726867626.33433: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867626.33436: Calling groups_plugins_play to load vars for managed_node3 30575 1726867626.34575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867626.35458: done with get_vars() 30575 1726867626.35473: done getting variables 30575 1726867626.35518: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:27:06 -0400 (0:00:00.055) 0:01:01.732 ****** 30575 1726867626.35543: entering _queue_task() for managed_node3/debug 30575 1726867626.35772: worker is 1 (out of 1 available) 30575 1726867626.35788: exiting _queue_task() for managed_node3/debug 30575 1726867626.35801: done queuing things up, now waiting for results queue to drain 30575 1726867626.35803: waiting for pending results... 
30575 1726867626.36005: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867626.36198: in run() - task 0affcac9-a3a5-e081-a588-00000000128e 30575 1726867626.36202: variable 'ansible_search_path' from source: unknown 30575 1726867626.36205: variable 'ansible_search_path' from source: unknown 30575 1726867626.36208: calling self._execute() 30575 1726867626.36483: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867626.36486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867626.36489: variable 'omit' from source: magic vars 30575 1726867626.36680: variable 'ansible_distribution_major_version' from source: facts 30575 1726867626.36696: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867626.36815: variable 'network_state' from source: role '' defaults 30575 1726867626.36833: Evaluated conditional (network_state != {}): False 30575 1726867626.36841: when evaluation is False, skipping this task 30575 1726867626.36849: _execute() done 30575 1726867626.36856: dumping result to json 30575 1726867626.36865: done dumping result, returning 30575 1726867626.36875: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-e081-a588-00000000128e] 30575 1726867626.36887: sending task result for task 0affcac9-a3a5-e081-a588-00000000128e skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30575 1726867626.37029: no more pending results, returning what we have 30575 1726867626.37034: results queue empty 30575 1726867626.37037: checking for any_errors_fatal 30575 1726867626.37045: done checking for any_errors_fatal 30575 1726867626.37046: checking for max_fail_percentage 30575 1726867626.37047: done checking for max_fail_percentage 30575 1726867626.37048: checking to see if all hosts have 
failed and the running result is not ok 30575 1726867626.37050: done checking to see if all hosts have failed 30575 1726867626.37050: getting the remaining hosts for this loop 30575 1726867626.37052: done getting the remaining hosts for this loop 30575 1726867626.37061: getting the next task for host managed_node3 30575 1726867626.37070: done getting next task for host managed_node3 30575 1726867626.37075: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867626.37083: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867626.37098: done sending task result for task 0affcac9-a3a5-e081-a588-00000000128e 30575 1726867626.37101: WORKER PROCESS EXITING 30575 1726867626.37120: getting variables 30575 1726867626.37122: in VariableManager get_vars() 30575 1726867626.37158: Calling all_inventory to load vars for managed_node3 30575 1726867626.37161: Calling groups_inventory to load vars for managed_node3 30575 1726867626.37164: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867626.37175: Calling all_plugins_play to load vars for managed_node3 30575 1726867626.37180: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867626.37183: Calling groups_plugins_play to load vars for managed_node3 30575 1726867626.38107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867626.38960: done with get_vars() 30575 1726867626.38975: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:27:06 -0400 (0:00:00.035) 0:01:01.768 ****** 30575 1726867626.39058: entering _queue_task() for managed_node3/ping 30575 1726867626.39337: worker is 1 (out of 1 available) 30575 1726867626.39350: exiting _queue_task() for managed_node3/ping 30575 1726867626.39363: done queuing things up, now waiting for results queue to drain 30575 1726867626.39365: waiting for pending results... 
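The "Re-test connectivity" task queued above dispatches the `ping` module over SSH. As the result further down shows, the module's observable contract is simple: it echoes its `data` argument back as `{"ping": ...}`. A minimal Python sketch of that round-trip — this models only the success path observed in this log, not the real module source:

```python
# Sketch of the contract the ping module exhibits in this log: the default
# data="pong" is echoed back, along with the invocation arguments. Names and
# structure mirror the JSON result captured above; this is illustrative, not
# the actual ansible.builtin.ping implementation.
import json

def ping(data="pong"):
    # Success path only: return the data under the "ping" key, plus the
    # module_args echo that Ansible includes in verbose output.
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

print(json.dumps(ping()))
```

Running this prints the same JSON payload that appears in the stdout chunk later in the log, which Ansible then summarizes as `ok: [managed_node3] => {"changed": false, "ping": "pong"}`.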
30575 1726867626.39651: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867626.39805: in run() - task 0affcac9-a3a5-e081-a588-00000000128f 30575 1726867626.39827: variable 'ansible_search_path' from source: unknown 30575 1726867626.39837: variable 'ansible_search_path' from source: unknown 30575 1726867626.39982: calling self._execute() 30575 1726867626.39986: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867626.39995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867626.40009: variable 'omit' from source: magic vars 30575 1726867626.40383: variable 'ansible_distribution_major_version' from source: facts 30575 1726867626.40400: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867626.40411: variable 'omit' from source: magic vars 30575 1726867626.40483: variable 'omit' from source: magic vars 30575 1726867626.40521: variable 'omit' from source: magic vars 30575 1726867626.40570: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867626.40614: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867626.40645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867626.40659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867626.40670: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867626.40697: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867626.40702: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867626.40705: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867626.40782: Set connection var ansible_pipelining to False 30575 1726867626.40786: Set connection var ansible_shell_type to sh 30575 1726867626.40792: Set connection var ansible_shell_executable to /bin/sh 30575 1726867626.40798: Set connection var ansible_timeout to 10 30575 1726867626.40807: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867626.40810: Set connection var ansible_connection to ssh 30575 1726867626.40828: variable 'ansible_shell_executable' from source: unknown 30575 1726867626.40831: variable 'ansible_connection' from source: unknown 30575 1726867626.40834: variable 'ansible_module_compression' from source: unknown 30575 1726867626.40836: variable 'ansible_shell_type' from source: unknown 30575 1726867626.40838: variable 'ansible_shell_executable' from source: unknown 30575 1726867626.40840: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867626.40842: variable 'ansible_pipelining' from source: unknown 30575 1726867626.40845: variable 'ansible_timeout' from source: unknown 30575 1726867626.40849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867626.40995: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867626.41003: variable 'omit' from source: magic vars 30575 1726867626.41009: starting attempt loop 30575 1726867626.41011: running the handler 30575 1726867626.41026: _low_level_execute_command(): starting 30575 1726867626.41032: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867626.41522: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 
1726867626.41526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867626.41530: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867626.41585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867626.41592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867626.41644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867626.43338: stdout chunk (state=3): >>>/root <<< 30575 1726867626.43512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867626.43526: stderr chunk (state=3): >>><<< 30575 1726867626.43529: stdout chunk (state=3): >>><<< 30575 1726867626.43572: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867626.43589: _low_level_execute_command(): starting 30575 1726867626.43593: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867626.4356017-33541-199624950595595 `" && echo ansible-tmp-1726867626.4356017-33541-199624950595595="` echo /root/.ansible/tmp/ansible-tmp-1726867626.4356017-33541-199624950595595 `" ) && sleep 0' 30575 1726867626.44035: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867626.44039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867626.44041: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867626.44050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867626.44099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867626.44108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867626.44150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867626.46032: stdout chunk (state=3): >>>ansible-tmp-1726867626.4356017-33541-199624950595595=/root/.ansible/tmp/ansible-tmp-1726867626.4356017-33541-199624950595595 <<< 30575 1726867626.46138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867626.46159: stderr chunk (state=3): >>><<< 30575 1726867626.46165: stdout chunk (state=3): >>><<< 30575 1726867626.46189: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867626.4356017-33541-199624950595595=/root/.ansible/tmp/ansible-tmp-1726867626.4356017-33541-199624950595595 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867626.46225: variable 'ansible_module_compression' from source: unknown 30575 1726867626.46257: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30575 1726867626.46288: variable 'ansible_facts' from source: unknown 30575 1726867626.46345: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867626.4356017-33541-199624950595595/AnsiballZ_ping.py 30575 1726867626.46438: Sending initial data 30575 1726867626.46442: Sent initial data (153 bytes) 30575 1726867626.46855: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867626.46859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867626.46862: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867626.46865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867626.46918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867626.46925: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867626.46967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867626.48522: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867626.48526: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867626.48563: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867626.48609: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpvdnx9qg5 /root/.ansible/tmp/ansible-tmp-1726867626.4356017-33541-199624950595595/AnsiballZ_ping.py <<< 30575 1726867626.48614: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867626.4356017-33541-199624950595595/AnsiballZ_ping.py" <<< 30575 1726867626.48653: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpvdnx9qg5" to remote "/root/.ansible/tmp/ansible-tmp-1726867626.4356017-33541-199624950595595/AnsiballZ_ping.py" <<< 30575 1726867626.48656: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867626.4356017-33541-199624950595595/AnsiballZ_ping.py" <<< 30575 1726867626.49175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867626.49206: stderr chunk (state=3): >>><<< 30575 1726867626.49210: stdout chunk (state=3): >>><<< 30575 1726867626.49252: done transferring module to remote 30575 1726867626.49262: _low_level_execute_command(): starting 30575 1726867626.49265: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867626.4356017-33541-199624950595595/ /root/.ansible/tmp/ansible-tmp-1726867626.4356017-33541-199624950595595/AnsiballZ_ping.py && sleep 0' 30575 1726867626.49644: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867626.49647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867626.49685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867626.49736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867626.49739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867626.49748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867626.49793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867626.51544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867626.51564: stderr chunk (state=3): >>><<< 30575 1726867626.51568: stdout chunk (state=3): >>><<< 30575 1726867626.51581: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867626.51585: _low_level_execute_command(): starting 30575 1726867626.51588: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867626.4356017-33541-199624950595595/AnsiballZ_ping.py && sleep 0' 30575 1726867626.51995: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867626.51999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867626.52001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867626.52003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867626.52005: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867626.52049: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867626.52063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867626.52122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867626.67083: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30575 1726867626.68365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867626.68395: stderr chunk (state=3): >>><<< 30575 1726867626.68398: stdout chunk (state=3): >>><<< 30575 1726867626.68415: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.15.68 closed. 30575 1726867626.68435: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867626.4356017-33541-199624950595595/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867626.68443: _low_level_execute_command(): starting 30575 1726867626.68448: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867626.4356017-33541-199624950595595/ > /dev/null 2>&1 && sleep 0' 30575 1726867626.68925: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867626.68928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867626.68930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867626.68932: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867626.68936: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867626.68983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867626.68990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867626.68992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867626.69039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867626.70910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867626.70948: stderr chunk (state=3): >>><<< 30575 1726867626.70953: stdout chunk (state=3): >>><<< 30575 1726867626.71184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867626.71194: handler run complete 30575 1726867626.71197: attempt loop complete, returning result 30575 1726867626.71200: _execute() done 30575 1726867626.71202: dumping result to json 30575 1726867626.71204: done dumping result, returning 30575 1726867626.71206: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-e081-a588-00000000128f] 30575 1726867626.71208: sending task result for task 0affcac9-a3a5-e081-a588-00000000128f 30575 1726867626.71286: done sending task result for task 0affcac9-a3a5-e081-a588-00000000128f 30575 1726867626.71291: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30575 1726867626.71367: no more pending results, returning what we have 30575 1726867626.71371: results queue empty 30575 1726867626.71372: checking for any_errors_fatal 30575 1726867626.71381: done checking for any_errors_fatal 30575 1726867626.71382: checking for max_fail_percentage 30575 1726867626.71383: done checking for max_fail_percentage 30575 1726867626.71384: checking to see if all hosts have failed and the running result is not ok 30575 1726867626.71385: done checking to see if all hosts have failed 30575 1726867626.71386: getting the remaining hosts for this loop 30575 1726867626.71391: done getting the remaining hosts for this loop 30575 1726867626.71395: getting the next task for host managed_node3 30575 1726867626.71409: done getting next task for host managed_node3 30575 1726867626.71411: ^ task is: TASK: meta (role_complete) 30575 1726867626.71416: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867626.71429: getting variables 30575 1726867626.71430: in VariableManager get_vars() 30575 1726867626.71472: Calling all_inventory to load vars for managed_node3 30575 1726867626.71474: Calling groups_inventory to load vars for managed_node3 30575 1726867626.71476: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867626.71672: Calling all_plugins_play to load vars for managed_node3 30575 1726867626.71675: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867626.71681: Calling groups_plugins_play to load vars for managed_node3 30575 1726867626.73032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867626.74636: done with get_vars() 30575 1726867626.74652: done getting variables 30575 1726867626.74717: done queuing things up, now waiting for results queue to drain 30575 1726867626.74719: results queue empty 30575 1726867626.74719: checking for any_errors_fatal 30575 1726867626.74721: done checking for 
any_errors_fatal 30575 1726867626.74721: checking for max_fail_percentage 30575 1726867626.74722: done checking for max_fail_percentage 30575 1726867626.74722: checking to see if all hosts have failed and the running result is not ok 30575 1726867626.74723: done checking to see if all hosts have failed 30575 1726867626.74723: getting the remaining hosts for this loop 30575 1726867626.74724: done getting the remaining hosts for this loop 30575 1726867626.74726: getting the next task for host managed_node3 30575 1726867626.74729: done getting next task for host managed_node3 30575 1726867626.74731: ^ task is: TASK: Test 30575 1726867626.74732: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867626.74734: getting variables 30575 1726867626.74735: in VariableManager get_vars() 30575 1726867626.74742: Calling all_inventory to load vars for managed_node3 30575 1726867626.74743: Calling groups_inventory to load vars for managed_node3 30575 1726867626.74745: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867626.74748: Calling all_plugins_play to load vars for managed_node3 30575 1726867626.74749: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867626.74751: Calling groups_plugins_play to load vars for managed_node3 30575 1726867626.75392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867626.76235: done with get_vars() 30575 1726867626.76248: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 17:27:06 -0400 (0:00:00.372) 0:01:02.140 ****** 30575 1726867626.76298: entering _queue_task() for managed_node3/include_tasks 30575 1726867626.76549: worker is 1 (out of 1 available) 30575 1726867626.76563: exiting _queue_task() for managed_node3/include_tasks 30575 1726867626.76576: done queuing things up, now waiting for results queue to drain 30575 1726867626.76579: waiting for pending results... 
30575 1726867626.76756: running TaskExecutor() for managed_node3/TASK: Test 30575 1726867626.76842: in run() - task 0affcac9-a3a5-e081-a588-000000001009 30575 1726867626.76852: variable 'ansible_search_path' from source: unknown 30575 1726867626.76855: variable 'ansible_search_path' from source: unknown 30575 1726867626.76897: variable 'lsr_test' from source: include params 30575 1726867626.77061: variable 'lsr_test' from source: include params 30575 1726867626.77120: variable 'omit' from source: magic vars 30575 1726867626.77216: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867626.77219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867626.77229: variable 'omit' from source: magic vars 30575 1726867626.77402: variable 'ansible_distribution_major_version' from source: facts 30575 1726867626.77409: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867626.77417: variable 'item' from source: unknown 30575 1726867626.77462: variable 'item' from source: unknown 30575 1726867626.77485: variable 'item' from source: unknown 30575 1726867626.77527: variable 'item' from source: unknown 30575 1726867626.77660: dumping result to json 30575 1726867626.77663: done dumping result, returning 30575 1726867626.77665: done running TaskExecutor() for managed_node3/TASK: Test [0affcac9-a3a5-e081-a588-000000001009] 30575 1726867626.77667: sending task result for task 0affcac9-a3a5-e081-a588-000000001009 30575 1726867626.77709: done sending task result for task 0affcac9-a3a5-e081-a588-000000001009 30575 1726867626.77714: WORKER PROCESS EXITING 30575 1726867626.77783: no more pending results, returning what we have 30575 1726867626.77787: in VariableManager get_vars() 30575 1726867626.77826: Calling all_inventory to load vars for managed_node3 30575 1726867626.77829: Calling groups_inventory to load vars for managed_node3 30575 1726867626.77831: Calling all_plugins_inventory to load 
vars for managed_node3 30575 1726867626.77840: Calling all_plugins_play to load vars for managed_node3 30575 1726867626.77842: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867626.77845: Calling groups_plugins_play to load vars for managed_node3 30575 1726867626.78703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867626.79542: done with get_vars() 30575 1726867626.79556: variable 'ansible_search_path' from source: unknown 30575 1726867626.79557: variable 'ansible_search_path' from source: unknown 30575 1726867626.79584: we have included files to process 30575 1726867626.79585: generating all_blocks data 30575 1726867626.79587: done generating all_blocks data 30575 1726867626.79591: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30575 1726867626.79592: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30575 1726867626.79593: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30575 1726867626.79717: done processing included file 30575 1726867626.79719: iterating over new_blocks loaded from include file 30575 1726867626.79720: in VariableManager get_vars() 30575 1726867626.79730: done with get_vars() 30575 1726867626.79731: filtering new block on tags 30575 1726867626.79747: done filtering new block on tags 30575 1726867626.79748: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml for managed_node3 => (item=tasks/remove_profile.yml) 30575 1726867626.79752: extending task lists for all hosts with included blocks 30575 1726867626.80235: done extending task lists 30575 
1726867626.80236: done processing included files 30575 1726867626.80236: results queue empty 30575 1726867626.80237: checking for any_errors_fatal 30575 1726867626.80238: done checking for any_errors_fatal 30575 1726867626.80238: checking for max_fail_percentage 30575 1726867626.80239: done checking for max_fail_percentage 30575 1726867626.80240: checking to see if all hosts have failed and the running result is not ok 30575 1726867626.80240: done checking to see if all hosts have failed 30575 1726867626.80241: getting the remaining hosts for this loop 30575 1726867626.80242: done getting the remaining hosts for this loop 30575 1726867626.80243: getting the next task for host managed_node3 30575 1726867626.80246: done getting next task for host managed_node3 30575 1726867626.80248: ^ task is: TASK: Include network role 30575 1726867626.80250: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867626.80251: getting variables 30575 1726867626.80252: in VariableManager get_vars() 30575 1726867626.80259: Calling all_inventory to load vars for managed_node3 30575 1726867626.80260: Calling groups_inventory to load vars for managed_node3 30575 1726867626.80262: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867626.80266: Calling all_plugins_play to load vars for managed_node3 30575 1726867626.80267: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867626.80269: Calling groups_plugins_play to load vars for managed_node3 30575 1726867626.80919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867626.81759: done with get_vars() 30575 1726867626.81773: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml:3 Friday 20 September 2024 17:27:06 -0400 (0:00:00.055) 0:01:02.195 ****** 30575 1726867626.81840: entering _queue_task() for managed_node3/include_role 30575 1726867626.82092: worker is 1 (out of 1 available) 30575 1726867626.82105: exiting _queue_task() for managed_node3/include_role 30575 1726867626.82121: done queuing things up, now waiting for results queue to drain 30575 1726867626.82122: waiting for pending results... 
30575 1726867626.82299: running TaskExecutor() for managed_node3/TASK: Include network role 30575 1726867626.82373: in run() - task 0affcac9-a3a5-e081-a588-0000000013e8 30575 1726867626.82386: variable 'ansible_search_path' from source: unknown 30575 1726867626.82390: variable 'ansible_search_path' from source: unknown 30575 1726867626.82421: calling self._execute() 30575 1726867626.82494: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867626.82498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867626.82506: variable 'omit' from source: magic vars 30575 1726867626.82782: variable 'ansible_distribution_major_version' from source: facts 30575 1726867626.82794: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867626.82798: _execute() done 30575 1726867626.82801: dumping result to json 30575 1726867626.82806: done dumping result, returning 30575 1726867626.82814: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcac9-a3a5-e081-a588-0000000013e8] 30575 1726867626.82818: sending task result for task 0affcac9-a3a5-e081-a588-0000000013e8 30575 1726867626.82916: done sending task result for task 0affcac9-a3a5-e081-a588-0000000013e8 30575 1726867626.82920: WORKER PROCESS EXITING 30575 1726867626.82945: no more pending results, returning what we have 30575 1726867626.82950: in VariableManager get_vars() 30575 1726867626.82993: Calling all_inventory to load vars for managed_node3 30575 1726867626.82995: Calling groups_inventory to load vars for managed_node3 30575 1726867626.82999: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867626.83015: Calling all_plugins_play to load vars for managed_node3 30575 1726867626.83018: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867626.83021: Calling groups_plugins_play to load vars for managed_node3 30575 1726867626.87324: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867626.88158: done with get_vars() 30575 1726867626.88171: variable 'ansible_search_path' from source: unknown 30575 1726867626.88172: variable 'ansible_search_path' from source: unknown 30575 1726867626.88251: variable 'omit' from source: magic vars 30575 1726867626.88273: variable 'omit' from source: magic vars 30575 1726867626.88283: variable 'omit' from source: magic vars 30575 1726867626.88285: we have included files to process 30575 1726867626.88285: generating all_blocks data 30575 1726867626.88286: done generating all_blocks data 30575 1726867626.88287: processing included file: fedora.linux_system_roles.network 30575 1726867626.88300: in VariableManager get_vars() 30575 1726867626.88309: done with get_vars() 30575 1726867626.88326: in VariableManager get_vars() 30575 1726867626.88336: done with get_vars() 30575 1726867626.88357: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30575 1726867626.88425: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30575 1726867626.88467: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30575 1726867626.88722: in VariableManager get_vars() 30575 1726867626.88737: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867626.89915: iterating over new_blocks loaded from include file 30575 1726867626.89917: in VariableManager get_vars() 30575 1726867626.89927: done with get_vars() 30575 1726867626.89928: filtering new block on tags 30575 1726867626.90084: done filtering new block on tags 30575 1726867626.90086: in VariableManager get_vars() 30575 1726867626.90095: done with get_vars() 30575 1726867626.90096: filtering new block on tags 30575 1726867626.90106: done 
filtering new block on tags 30575 1726867626.90107: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30575 1726867626.90110: extending task lists for all hosts with included blocks 30575 1726867626.90173: done extending task lists 30575 1726867626.90174: done processing included files 30575 1726867626.90174: results queue empty 30575 1726867626.90175: checking for any_errors_fatal 30575 1726867626.90178: done checking for any_errors_fatal 30575 1726867626.90178: checking for max_fail_percentage 30575 1726867626.90179: done checking for max_fail_percentage 30575 1726867626.90180: checking to see if all hosts have failed and the running result is not ok 30575 1726867626.90180: done checking to see if all hosts have failed 30575 1726867626.90181: getting the remaining hosts for this loop 30575 1726867626.90181: done getting the remaining hosts for this loop 30575 1726867626.90183: getting the next task for host managed_node3 30575 1726867626.90185: done getting next task for host managed_node3 30575 1726867626.90187: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867626.90189: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867626.90195: getting variables 30575 1726867626.90196: in VariableManager get_vars() 30575 1726867626.90204: Calling all_inventory to load vars for managed_node3 30575 1726867626.90205: Calling groups_inventory to load vars for managed_node3 30575 1726867626.90206: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867626.90210: Calling all_plugins_play to load vars for managed_node3 30575 1726867626.90211: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867626.90213: Calling groups_plugins_play to load vars for managed_node3 30575 1726867626.90870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867626.91704: done with get_vars() 30575 1726867626.91717: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:27:06 -0400 (0:00:00.099) 0:01:02.295 ****** 30575 1726867626.91760: entering _queue_task() for managed_node3/include_tasks 30575 1726867626.92032: worker is 1 (out of 1 available) 30575 1726867626.92045: exiting _queue_task() for managed_node3/include_tasks 30575 1726867626.92059: done queuing things up, now waiting for results queue to drain 30575 1726867626.92060: waiting for pending results... 
30575 1726867626.92247: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867626.92334: in run() - task 0affcac9-a3a5-e081-a588-00000000145f 30575 1726867626.92347: variable 'ansible_search_path' from source: unknown 30575 1726867626.92350: variable 'ansible_search_path' from source: unknown 30575 1726867626.92379: calling self._execute() 30575 1726867626.92453: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867626.92456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867626.92466: variable 'omit' from source: magic vars 30575 1726867626.92765: variable 'ansible_distribution_major_version' from source: facts 30575 1726867626.92773: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867626.92781: _execute() done 30575 1726867626.92788: dumping result to json 30575 1726867626.92791: done dumping result, returning 30575 1726867626.92799: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-e081-a588-00000000145f] 30575 1726867626.92804: sending task result for task 0affcac9-a3a5-e081-a588-00000000145f 30575 1726867626.92889: done sending task result for task 0affcac9-a3a5-e081-a588-00000000145f 30575 1726867626.92891: WORKER PROCESS EXITING 30575 1726867626.92974: no more pending results, returning what we have 30575 1726867626.92982: in VariableManager get_vars() 30575 1726867626.93024: Calling all_inventory to load vars for managed_node3 30575 1726867626.93027: Calling groups_inventory to load vars for managed_node3 30575 1726867626.93029: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867626.93038: Calling all_plugins_play to load vars for managed_node3 30575 1726867626.93041: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867626.93043: Calling 
groups_plugins_play to load vars for managed_node3 30575 1726867626.93783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867626.94641: done with get_vars() 30575 1726867626.94653: variable 'ansible_search_path' from source: unknown 30575 1726867626.94654: variable 'ansible_search_path' from source: unknown 30575 1726867626.94680: we have included files to process 30575 1726867626.94681: generating all_blocks data 30575 1726867626.94683: done generating all_blocks data 30575 1726867626.94684: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867626.94685: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867626.94686: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867626.95040: done processing included file 30575 1726867626.95042: iterating over new_blocks loaded from include file 30575 1726867626.95043: in VariableManager get_vars() 30575 1726867626.95056: done with get_vars() 30575 1726867626.95058: filtering new block on tags 30575 1726867626.95075: done filtering new block on tags 30575 1726867626.95078: in VariableManager get_vars() 30575 1726867626.95094: done with get_vars() 30575 1726867626.95095: filtering new block on tags 30575 1726867626.95121: done filtering new block on tags 30575 1726867626.95123: in VariableManager get_vars() 30575 1726867626.95136: done with get_vars() 30575 1726867626.95138: filtering new block on tags 30575 1726867626.95159: done filtering new block on tags 30575 1726867626.95161: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30575 1726867626.95164: extending task lists for 
all hosts with included blocks 30575 1726867626.96106: done extending task lists 30575 1726867626.96107: done processing included files 30575 1726867626.96107: results queue empty 30575 1726867626.96108: checking for any_errors_fatal 30575 1726867626.96110: done checking for any_errors_fatal 30575 1726867626.96110: checking for max_fail_percentage 30575 1726867626.96111: done checking for max_fail_percentage 30575 1726867626.96111: checking to see if all hosts have failed and the running result is not ok 30575 1726867626.96112: done checking to see if all hosts have failed 30575 1726867626.96113: getting the remaining hosts for this loop 30575 1726867626.96114: done getting the remaining hosts for this loop 30575 1726867626.96116: getting the next task for host managed_node3 30575 1726867626.96119: done getting next task for host managed_node3 30575 1726867626.96121: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867626.96123: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867626.96129: getting variables 30575 1726867626.96130: in VariableManager get_vars() 30575 1726867626.96138: Calling all_inventory to load vars for managed_node3 30575 1726867626.96140: Calling groups_inventory to load vars for managed_node3 30575 1726867626.96141: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867626.96144: Calling all_plugins_play to load vars for managed_node3 30575 1726867626.96145: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867626.96147: Calling groups_plugins_play to load vars for managed_node3 30575 1726867626.96778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867626.97614: done with get_vars() 30575 1726867626.97628: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:27:06 -0400 (0:00:00.059) 0:01:02.354 ****** 30575 1726867626.97675: entering _queue_task() for managed_node3/setup 30575 1726867626.97903: worker is 1 (out of 1 available) 30575 1726867626.97916: exiting _queue_task() for managed_node3/setup 30575 1726867626.97929: done queuing things up, now waiting for results queue to drain 30575 1726867626.97931: waiting for pending results... 
30575 1726867626.98105: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867626.98196: in run() - task 0affcac9-a3a5-e081-a588-0000000014b6 30575 1726867626.98208: variable 'ansible_search_path' from source: unknown 30575 1726867626.98211: variable 'ansible_search_path' from source: unknown 30575 1726867626.98241: calling self._execute() 30575 1726867626.98310: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867626.98314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867626.98325: variable 'omit' from source: magic vars 30575 1726867626.98600: variable 'ansible_distribution_major_version' from source: facts 30575 1726867626.98610: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867626.98758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867627.00223: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867627.00266: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867627.00294: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867627.00323: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867627.00345: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867627.00403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867627.00426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867627.00445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867627.00474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867627.00486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867627.00524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867627.00540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867627.00558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867627.00587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867627.00597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867627.00703: variable '__network_required_facts' from source: role 
'' defaults 30575 1726867627.00710: variable 'ansible_facts' from source: unknown 30575 1726867627.01148: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30575 1726867627.01152: when evaluation is False, skipping this task 30575 1726867627.01155: _execute() done 30575 1726867627.01157: dumping result to json 30575 1726867627.01160: done dumping result, returning 30575 1726867627.01167: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-e081-a588-0000000014b6] 30575 1726867627.01172: sending task result for task 0affcac9-a3a5-e081-a588-0000000014b6 30575 1726867627.01249: done sending task result for task 0affcac9-a3a5-e081-a588-0000000014b6 30575 1726867627.01252: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867627.01297: no more pending results, returning what we have 30575 1726867627.01300: results queue empty 30575 1726867627.01300: checking for any_errors_fatal 30575 1726867627.01302: done checking for any_errors_fatal 30575 1726867627.01303: checking for max_fail_percentage 30575 1726867627.01304: done checking for max_fail_percentage 30575 1726867627.01305: checking to see if all hosts have failed and the running result is not ok 30575 1726867627.01306: done checking to see if all hosts have failed 30575 1726867627.01307: getting the remaining hosts for this loop 30575 1726867627.01308: done getting the remaining hosts for this loop 30575 1726867627.01312: getting the next task for host managed_node3 30575 1726867627.01323: done getting next task for host managed_node3 30575 1726867627.01328: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867627.01334: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867627.01352: getting variables 30575 1726867627.01354: in VariableManager get_vars() 30575 1726867627.01398: Calling all_inventory to load vars for managed_node3 30575 1726867627.01400: Calling groups_inventory to load vars for managed_node3 30575 1726867627.01402: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867627.01410: Calling all_plugins_play to load vars for managed_node3 30575 1726867627.01413: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867627.01421: Calling groups_plugins_play to load vars for managed_node3 30575 1726867627.02185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867627.03151: done with get_vars() 30575 1726867627.03164: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:27:07 -0400 (0:00:00.055) 0:01:02.409 ****** 30575 1726867627.03232: entering _queue_task() for managed_node3/stat 30575 1726867627.03442: worker is 1 (out of 1 available) 30575 1726867627.03457: exiting _queue_task() for managed_node3/stat 30575 1726867627.03470: done queuing things up, now waiting for results queue to drain 30575 1726867627.03472: waiting for pending results... 
30575 1726867627.03654: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867627.03749: in run() - task 0affcac9-a3a5-e081-a588-0000000014b8 30575 1726867627.03760: variable 'ansible_search_path' from source: unknown 30575 1726867627.03763: variable 'ansible_search_path' from source: unknown 30575 1726867627.03794: calling self._execute() 30575 1726867627.03865: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867627.03870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867627.03879: variable 'omit' from source: magic vars 30575 1726867627.04160: variable 'ansible_distribution_major_version' from source: facts 30575 1726867627.04169: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867627.04288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867627.04486: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867627.04519: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867627.04543: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867627.04570: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867627.04636: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867627.04653: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867627.04672: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867627.04694: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867627.04754: variable '__network_is_ostree' from source: set_fact 30575 1726867627.04760: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867627.04763: when evaluation is False, skipping this task 30575 1726867627.04766: _execute() done 30575 1726867627.04768: dumping result to json 30575 1726867627.04773: done dumping result, returning 30575 1726867627.04783: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-e081-a588-0000000014b8] 30575 1726867627.04788: sending task result for task 0affcac9-a3a5-e081-a588-0000000014b8 30575 1726867627.04868: done sending task result for task 0affcac9-a3a5-e081-a588-0000000014b8 30575 1726867627.04871: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867627.04938: no more pending results, returning what we have 30575 1726867627.04941: results queue empty 30575 1726867627.04942: checking for any_errors_fatal 30575 1726867627.04947: done checking for any_errors_fatal 30575 1726867627.04947: checking for max_fail_percentage 30575 1726867627.04948: done checking for max_fail_percentage 30575 1726867627.04949: checking to see if all hosts have failed and the running result is not ok 30575 1726867627.04950: done checking to see if all hosts have failed 30575 1726867627.04951: getting the remaining hosts for this loop 30575 1726867627.04952: done getting the remaining hosts for this loop 30575 
1726867627.04955: getting the next task for host managed_node3 30575 1726867627.04963: done getting next task for host managed_node3 30575 1726867627.04966: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867627.04971: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867627.04989: getting variables 30575 1726867627.04990: in VariableManager get_vars() 30575 1726867627.05023: Calling all_inventory to load vars for managed_node3 30575 1726867627.05026: Calling groups_inventory to load vars for managed_node3 30575 1726867627.05028: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867627.05035: Calling all_plugins_play to load vars for managed_node3 30575 1726867627.05037: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867627.05039: Calling groups_plugins_play to load vars for managed_node3 30575 1726867627.05783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867627.06656: done with get_vars() 30575 1726867627.06670: done getting variables 30575 1726867627.06712: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:27:07 -0400 (0:00:00.035) 0:01:02.444 ****** 30575 1726867627.06739: entering _queue_task() for managed_node3/set_fact 30575 1726867627.06942: worker is 1 (out of 1 available) 30575 1726867627.06955: exiting _queue_task() for managed_node3/set_fact 30575 1726867627.06967: done queuing things up, now waiting for results queue to drain 30575 1726867627.06969: waiting for pending results... 
30575 1726867627.07138: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867627.07240: in run() - task 0affcac9-a3a5-e081-a588-0000000014b9 30575 1726867627.07250: variable 'ansible_search_path' from source: unknown 30575 1726867627.07254: variable 'ansible_search_path' from source: unknown 30575 1726867627.07283: calling self._execute() 30575 1726867627.07349: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867627.07353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867627.07363: variable 'omit' from source: magic vars 30575 1726867627.07634: variable 'ansible_distribution_major_version' from source: facts 30575 1726867627.07637: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867627.07754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867627.07943: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867627.07978: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867627.08001: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867627.08027: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867627.08090: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867627.08108: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867627.08127: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867627.08144: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867627.08207: variable '__network_is_ostree' from source: set_fact 30575 1726867627.08215: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867627.08219: when evaluation is False, skipping this task 30575 1726867627.08221: _execute() done 30575 1726867627.08223: dumping result to json 30575 1726867627.08226: done dumping result, returning 30575 1726867627.08233: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-e081-a588-0000000014b9] 30575 1726867627.08237: sending task result for task 0affcac9-a3a5-e081-a588-0000000014b9 30575 1726867627.08322: done sending task result for task 0affcac9-a3a5-e081-a588-0000000014b9 30575 1726867627.08325: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867627.08370: no more pending results, returning what we have 30575 1726867627.08373: results queue empty 30575 1726867627.08373: checking for any_errors_fatal 30575 1726867627.08380: done checking for any_errors_fatal 30575 1726867627.08381: checking for max_fail_percentage 30575 1726867627.08383: done checking for max_fail_percentage 30575 1726867627.08384: checking to see if all hosts have failed and the running result is not ok 30575 1726867627.08385: done checking to see if all hosts have failed 30575 1726867627.08385: getting the remaining hosts for this loop 30575 1726867627.08387: done getting the remaining hosts for this loop 
30575 1726867627.08390: getting the next task for host managed_node3 30575 1726867627.08399: done getting next task for host managed_node3 30575 1726867627.08402: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867627.08407: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867627.08425: getting variables 30575 1726867627.08427: in VariableManager get_vars() 30575 1726867627.08460: Calling all_inventory to load vars for managed_node3 30575 1726867627.08462: Calling groups_inventory to load vars for managed_node3 30575 1726867627.08464: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867627.08471: Calling all_plugins_play to load vars for managed_node3 30575 1726867627.08473: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867627.08476: Calling groups_plugins_play to load vars for managed_node3 30575 1726867627.09353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867627.10203: done with get_vars() 30575 1726867627.10217: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:27:07 -0400 (0:00:00.035) 0:01:02.480 ****** 30575 1726867627.10280: entering _queue_task() for managed_node3/service_facts 30575 1726867627.10469: worker is 1 (out of 1 available) 30575 1726867627.10484: exiting _queue_task() for managed_node3/service_facts 30575 1726867627.10496: done queuing things up, now waiting for results queue to drain 30575 1726867627.10498: waiting for pending results... 
30575 1726867627.10682: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867627.10767: in run() - task 0affcac9-a3a5-e081-a588-0000000014bb 30575 1726867627.10781: variable 'ansible_search_path' from source: unknown 30575 1726867627.10784: variable 'ansible_search_path' from source: unknown 30575 1726867627.10811: calling self._execute() 30575 1726867627.10884: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867627.10888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867627.10898: variable 'omit' from source: magic vars 30575 1726867627.11172: variable 'ansible_distribution_major_version' from source: facts 30575 1726867627.11181: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867627.11188: variable 'omit' from source: magic vars 30575 1726867627.11235: variable 'omit' from source: magic vars 30575 1726867627.11258: variable 'omit' from source: magic vars 30575 1726867627.11291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867627.11320: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867627.11335: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867627.11347: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867627.11357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867627.11384: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867627.11388: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867627.11390: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867627.11458: Set connection var ansible_pipelining to False 30575 1726867627.11461: Set connection var ansible_shell_type to sh 30575 1726867627.11466: Set connection var ansible_shell_executable to /bin/sh 30575 1726867627.11471: Set connection var ansible_timeout to 10 30575 1726867627.11476: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867627.11487: Set connection var ansible_connection to ssh 30575 1726867627.11505: variable 'ansible_shell_executable' from source: unknown 30575 1726867627.11508: variable 'ansible_connection' from source: unknown 30575 1726867627.11511: variable 'ansible_module_compression' from source: unknown 30575 1726867627.11513: variable 'ansible_shell_type' from source: unknown 30575 1726867627.11516: variable 'ansible_shell_executable' from source: unknown 30575 1726867627.11520: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867627.11523: variable 'ansible_pipelining' from source: unknown 30575 1726867627.11526: variable 'ansible_timeout' from source: unknown 30575 1726867627.11530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867627.11668: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867627.11678: variable 'omit' from source: magic vars 30575 1726867627.11682: starting attempt loop 30575 1726867627.11686: running the handler 30575 1726867627.11700: _low_level_execute_command(): starting 30575 1726867627.11705: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867627.12209: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30575 1726867627.12216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867627.12220: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867627.12257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867627.12269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867627.12334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867627.14028: stdout chunk (state=3): >>>/root <<< 30575 1726867627.14124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867627.14152: stderr chunk (state=3): >>><<< 30575 1726867627.14155: stdout chunk (state=3): >>><<< 30575 1726867627.14172: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867627.14184: _low_level_execute_command(): starting 30575 1726867627.14189: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867627.1417115-33574-268872699682984 `" && echo ansible-tmp-1726867627.1417115-33574-268872699682984="` echo /root/.ansible/tmp/ansible-tmp-1726867627.1417115-33574-268872699682984 `" ) && sleep 0' 30575 1726867627.14613: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867627.14617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867627.14619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867627.14628: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867627.14631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867627.14681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867627.14685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867627.14687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867627.14723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867627.16586: stdout chunk (state=3): >>>ansible-tmp-1726867627.1417115-33574-268872699682984=/root/.ansible/tmp/ansible-tmp-1726867627.1417115-33574-268872699682984 <<< 30575 1726867627.16691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867627.16715: stderr chunk (state=3): >>><<< 30575 1726867627.16719: stdout chunk (state=3): >>><<< 30575 1726867627.16732: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867627.1417115-33574-268872699682984=/root/.ansible/tmp/ansible-tmp-1726867627.1417115-33574-268872699682984 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867627.16766: variable 'ansible_module_compression' from source: unknown 30575 1726867627.16801: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30575 1726867627.16834: variable 'ansible_facts' from source: unknown 30575 1726867627.16893: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867627.1417115-33574-268872699682984/AnsiballZ_service_facts.py 30575 1726867627.16989: Sending initial data 30575 1726867627.16992: Sent initial data (162 bytes) 30575 1726867627.17422: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867627.17425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867627.17428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867627.17430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867627.17432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867627.17476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867627.17489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867627.17530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867627.19049: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30575 1726867627.19056: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867627.19094: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867627.19141: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpijaygx4_ /root/.ansible/tmp/ansible-tmp-1726867627.1417115-33574-268872699682984/AnsiballZ_service_facts.py <<< 30575 1726867627.19144: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867627.1417115-33574-268872699682984/AnsiballZ_service_facts.py" <<< 30575 1726867627.19186: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpijaygx4_" to remote "/root/.ansible/tmp/ansible-tmp-1726867627.1417115-33574-268872699682984/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867627.1417115-33574-268872699682984/AnsiballZ_service_facts.py" <<< 30575 1726867627.19734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867627.19769: stderr chunk (state=3): >>><<< 30575 1726867627.19772: stdout chunk (state=3): >>><<< 30575 1726867627.19824: done transferring module to remote 30575 1726867627.19832: _low_level_execute_command(): starting 30575 1726867627.19835: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867627.1417115-33574-268872699682984/ /root/.ansible/tmp/ansible-tmp-1726867627.1417115-33574-268872699682984/AnsiballZ_service_facts.py && sleep 0' 30575 1726867627.20257: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867627.20260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867627.20262: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867627.20264: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867627.20269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867627.20271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867627.20316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867627.20321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867627.20370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867627.22105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867627.22128: stderr chunk (state=3): >>><<< 30575 1726867627.22132: stdout chunk (state=3): >>><<< 30575 1726867627.22142: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867627.22145: _low_level_execute_command(): starting 30575 1726867627.22150: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867627.1417115-33574-268872699682984/AnsiballZ_service_facts.py && sleep 0' 30575 1726867627.22550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867627.22554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867627.22556: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867627.22558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867627.22608: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867627.22612: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867627.22665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867628.73315: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": 
"dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 30575 1726867628.73326: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": 
"network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": 
"stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integratio<<< 30575 1726867628.73334: stdout chunk (state=3): >>>n.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-<<< 30575 1726867628.73337: stdout chunk (state=3): >>>boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": 
{"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": 
{"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30575 1726867628.74798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867628.74825: stderr chunk (state=3): >>><<< 30575 1726867628.74838: stdout chunk (state=3): >>><<< 30575 1726867628.74900: _low_level_execute_command() done: rc=0, stdout= , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867628.75685: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867627.1417115-33574-268872699682984/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867628.75690: _low_level_execute_command(): starting 30575 1726867628.75693: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867627.1417115-33574-268872699682984/ > /dev/null 2>&1 && sleep 0' 30575 1726867628.76329: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867628.76366: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867628.76393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867628.76462: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867628.76481: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867628.76508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867628.76527: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867628.76549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867628.76623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867628.78598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867628.78601: stdout chunk (state=3): >>><<< 30575 1726867628.78604: stderr chunk (state=3): >>><<< 30575 1726867628.78990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867628.78993: handler run complete 30575 1726867628.79098: variable 'ansible_facts' from source: unknown 30575 1726867628.79360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867628.80374: variable 'ansible_facts' from source: unknown 30575 1726867628.80692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867628.81160: attempt loop complete, returning result 30575 1726867628.81292: _execute() done 30575 1726867628.81304: dumping result to json 30575 1726867628.81372: done dumping result, returning 30575 1726867628.81392: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-e081-a588-0000000014bb] 30575 1726867628.81529: sending task result for task 0affcac9-a3a5-e081-a588-0000000014bb ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867628.82467: no more pending results, returning what we have 30575 1726867628.82470: results queue empty 30575 1726867628.82471: checking for any_errors_fatal 30575 1726867628.82476: done checking for any_errors_fatal 30575 1726867628.82478: checking for max_fail_percentage 30575 1726867628.82480: done checking for max_fail_percentage 30575 1726867628.82481: checking to see if all hosts have failed and the running result is not ok 30575 1726867628.82483: done checking to see if all hosts have failed 30575 1726867628.82484: getting the remaining hosts for this loop 30575 1726867628.82485: done getting the remaining hosts for this loop 30575 1726867628.82492: getting the next task for host managed_node3 30575 1726867628.82499: done getting next task for host managed_node3 30575 1726867628.82503: ^ task is: 
TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867628.82508: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867628.82520: getting variables 30575 1726867628.82521: in VariableManager get_vars() 30575 1726867628.82553: Calling all_inventory to load vars for managed_node3 30575 1726867628.82555: Calling groups_inventory to load vars for managed_node3 30575 1726867628.82558: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867628.82566: Calling all_plugins_play to load vars for managed_node3 30575 1726867628.82569: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867628.82572: Calling groups_plugins_play to load vars for managed_node3 30575 1726867628.82586: done sending task result for task 0affcac9-a3a5-e081-a588-0000000014bb 30575 1726867628.82603: WORKER PROCESS EXITING 30575 1726867628.84162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867628.85968: done with get_vars() 30575 1726867628.85990: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:27:08 -0400 (0:00:01.758) 0:01:04.238 ****** 30575 1726867628.86089: entering _queue_task() for managed_node3/package_facts 30575 1726867628.86394: worker is 1 (out of 1 available) 30575 1726867628.86405: exiting _queue_task() for managed_node3/package_facts 30575 1726867628.86418: done queuing things up, now waiting for results queue to drain 30575 1726867628.86420: waiting for pending results... 
30575 1726867628.86698: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867628.86848: in run() - task 0affcac9-a3a5-e081-a588-0000000014bc 30575 1726867628.86870: variable 'ansible_search_path' from source: unknown 30575 1726867628.86880: variable 'ansible_search_path' from source: unknown 30575 1726867628.86924: calling self._execute() 30575 1726867628.87022: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867628.87033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867628.87046: variable 'omit' from source: magic vars 30575 1726867628.87432: variable 'ansible_distribution_major_version' from source: facts 30575 1726867628.87456: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867628.87555: variable 'omit' from source: magic vars 30575 1726867628.87558: variable 'omit' from source: magic vars 30575 1726867628.87583: variable 'omit' from source: magic vars 30575 1726867628.87622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867628.87661: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867628.87692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867628.87714: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867628.87734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867628.87780: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867628.87794: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867628.87886: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867628.87926: Set connection var ansible_pipelining to False 30575 1726867628.87938: Set connection var ansible_shell_type to sh 30575 1726867628.87950: Set connection var ansible_shell_executable to /bin/sh 30575 1726867628.87960: Set connection var ansible_timeout to 10 30575 1726867628.87969: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867628.87984: Set connection var ansible_connection to ssh 30575 1726867628.88024: variable 'ansible_shell_executable' from source: unknown 30575 1726867628.88033: variable 'ansible_connection' from source: unknown 30575 1726867628.88041: variable 'ansible_module_compression' from source: unknown 30575 1726867628.88048: variable 'ansible_shell_type' from source: unknown 30575 1726867628.88054: variable 'ansible_shell_executable' from source: unknown 30575 1726867628.88063: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867628.88071: variable 'ansible_pipelining' from source: unknown 30575 1726867628.88081: variable 'ansible_timeout' from source: unknown 30575 1726867628.88089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867628.88308: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867628.88382: variable 'omit' from source: magic vars 30575 1726867628.88385: starting attempt loop 30575 1726867628.88388: running the handler 30575 1726867628.88390: _low_level_execute_command(): starting 30575 1726867628.88392: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867628.89205: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867628.89256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867628.89269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867628.89512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867628.91204: stdout chunk (state=3): >>>/root <<< 30575 1726867628.91342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867628.91367: stdout chunk (state=3): >>><<< 30575 1726867628.91373: stderr chunk (state=3): >>><<< 30575 1726867628.91399: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867628.91490: _low_level_execute_command(): starting 30575 1726867628.91494: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867628.9140565-33627-143387026539442 `" && echo ansible-tmp-1726867628.9140565-33627-143387026539442="` echo /root/.ansible/tmp/ansible-tmp-1726867628.9140565-33627-143387026539442 `" ) && sleep 0' 30575 1726867628.92040: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867628.92053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867628.92070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867628.92096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867628.92148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867628.92228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867628.92257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867628.92291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867628.92331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867628.94238: stdout chunk (state=3): >>>ansible-tmp-1726867628.9140565-33627-143387026539442=/root/.ansible/tmp/ansible-tmp-1726867628.9140565-33627-143387026539442 <<< 30575 1726867628.94402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867628.94405: stdout chunk (state=3): >>><<< 30575 1726867628.94408: stderr chunk (state=3): >>><<< 30575 1726867628.94423: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867628.9140565-33627-143387026539442=/root/.ansible/tmp/ansible-tmp-1726867628.9140565-33627-143387026539442 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867628.94488: variable 'ansible_module_compression' from source: unknown 30575 1726867628.94535: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30575 1726867628.94614: variable 'ansible_facts' from source: unknown 30575 1726867628.94818: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867628.9140565-33627-143387026539442/AnsiballZ_package_facts.py 30575 1726867628.95003: Sending initial data 30575 1726867628.95006: Sent initial data (162 bytes) 30575 1726867628.95581: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867628.95584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867628.95595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867628.95695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867628.95706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867628.95755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867628.95759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867628.95805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867628.97372: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30575 1726867628.97390: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30575 1726867628.97414: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867628.97480: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867628.97531: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpyfvvk7yu /root/.ansible/tmp/ansible-tmp-1726867628.9140565-33627-143387026539442/AnsiballZ_package_facts.py <<< 30575 1726867628.97534: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867628.9140565-33627-143387026539442/AnsiballZ_package_facts.py" <<< 30575 1726867628.97576: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpyfvvk7yu" to remote "/root/.ansible/tmp/ansible-tmp-1726867628.9140565-33627-143387026539442/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867628.9140565-33627-143387026539442/AnsiballZ_package_facts.py" <<< 30575 1726867628.99042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867628.99071: stderr chunk (state=3): >>><<< 30575 1726867628.99184: stdout chunk (state=3): >>><<< 30575 1726867628.99187: done transferring module to remote 30575 1726867628.99189: _low_level_execute_command(): starting 30575 1726867628.99192: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867628.9140565-33627-143387026539442/ /root/.ansible/tmp/ansible-tmp-1726867628.9140565-33627-143387026539442/AnsiballZ_package_facts.py && sleep 0' 30575 1726867628.99728: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867628.99781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867628.99801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867628.99837: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867628.99915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867629.01746: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867629.01757: stdout chunk (state=3): >>><<< 30575 1726867629.01768: stderr chunk (state=3): >>><<< 30575 1726867629.01790: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867629.01800: _low_level_execute_command(): starting 30575 1726867629.01821: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867628.9140565-33627-143387026539442/AnsiballZ_package_facts.py && sleep 0' 30575 1726867629.02506: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867629.02562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867629.02644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867629.02689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867629.02706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867629.02737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 
1726867629.02814: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867629.47300: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30575 1726867629.47347: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", 
"version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": 
[{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": 
"1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", 
"version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30575 1726867629.47364: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": 
"keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": 
"3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 30575 1726867629.47373: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30575 1726867629.47386: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 30575 1726867629.47398: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": 
"1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", 
"version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 30575 1726867629.47443: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": 
[{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": 
[{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": 
"13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": 
"perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", 
"version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", 
"version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": 
"lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30575 1726867629.49292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
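[editor's note] The `package_facts` result that ends above maps each package name to a list of install records of the form `{name, version, release, epoch, arch, source}` under `ansible_facts.packages`. A minimal sketch (Python, not part of the log) of consuming such a structure, with two sample records copied from the output; the `nevra` helper name is ours:

```python
# Sketch: query a package_facts-style payload and render records as
# RPM NEVRA strings (name-[epoch:]version-release.arch).
# Sample data abbreviated from the log above; None corresponds to JSON null.
facts = {
    "ansible_facts": {
        "packages": {
            "python3": [{"name": "python3", "version": "3.12.5",
                         "release": "2.el10", "epoch": None,
                         "arch": "x86_64", "source": "rpm"}],
            "openssl": [{"name": "openssl", "version": "3.2.2",
                         "release": "12.el10", "epoch": 1,
                         "arch": "x86_64", "source": "rpm"}],
        }
    }
}

def nevra(pkg: dict) -> str:
    """Format one install record; the epoch prefix is omitted when null."""
    epoch = f"{pkg['epoch']}:" if pkg.get("epoch") else ""
    return f"{pkg['name']}-{epoch}{pkg['version']}-{pkg['release']}.{pkg['arch']}"

packages = facts["ansible_facts"]["packages"]
print(nevra(packages["python3"][0]))  # python3-3.12.5-2.el10.x86_64
print(nevra(packages["openssl"][0]))  # openssl-1:3.2.2-12.el10.x86_64
```

Each value is a list because a name can have several installed instances (e.g. multiple kernels); real playbooks would index or iterate accordingly.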
<<< 30575 1726867629.49316: stderr chunk (state=3): >>><<< 30575 1726867629.49341: stdout chunk (state=3): >>><<< 30575 1726867629.49443: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867629.51838: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867628.9140565-33627-143387026539442/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867629.51862: _low_level_execute_command(): starting 30575 1726867629.51872: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867628.9140565-33627-143387026539442/ > /dev/null 2>&1 && sleep 0' 30575 1726867629.52683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867629.52690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867629.52711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867629.52726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867629.52808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867629.54656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867629.54707: stderr chunk (state=3): >>><<< 30575 1726867629.54719: stdout chunk (state=3): >>><<< 30575 1726867629.54740: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867629.54751: handler run complete 30575 1726867629.55636: variable 'ansible_facts' from source: 
unknown 30575 1726867629.56222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867629.58373: variable 'ansible_facts' from source: unknown 30575 1726867629.59489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867629.60287: attempt loop complete, returning result 30575 1726867629.60304: _execute() done 30575 1726867629.60318: dumping result to json 30575 1726867629.60539: done dumping result, returning 30575 1726867629.60554: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-e081-a588-0000000014bc] 30575 1726867629.60564: sending task result for task 0affcac9-a3a5-e081-a588-0000000014bc 30575 1726867629.63289: done sending task result for task 0affcac9-a3a5-e081-a588-0000000014bc 30575 1726867629.63293: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867629.63467: no more pending results, returning what we have 30575 1726867629.63470: results queue empty 30575 1726867629.63471: checking for any_errors_fatal 30575 1726867629.63480: done checking for any_errors_fatal 30575 1726867629.63481: checking for max_fail_percentage 30575 1726867629.63483: done checking for max_fail_percentage 30575 1726867629.63484: checking to see if all hosts have failed and the running result is not ok 30575 1726867629.63485: done checking to see if all hosts have failed 30575 1726867629.63489: getting the remaining hosts for this loop 30575 1726867629.63491: done getting the remaining hosts for this loop 30575 1726867629.63495: getting the next task for host managed_node3 30575 1726867629.63503: done getting next task for host managed_node3 30575 1726867629.63507: ^ task is: TASK: fedora.linux_system_roles.network : Print network 
provider 30575 1726867629.63512: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867629.63524: getting variables 30575 1726867629.63526: in VariableManager get_vars() 30575 1726867629.63557: Calling all_inventory to load vars for managed_node3 30575 1726867629.63560: Calling groups_inventory to load vars for managed_node3 30575 1726867629.63562: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867629.63571: Calling all_plugins_play to load vars for managed_node3 30575 1726867629.63574: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867629.63580: Calling groups_plugins_play to load vars for managed_node3 30575 1726867629.64843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867629.66461: done with get_vars() 30575 1726867629.66484: done getting variables 30575 1726867629.66545: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:27:09 -0400 (0:00:00.804) 0:01:05.043 ****** 30575 1726867629.66585: entering _queue_task() for managed_node3/debug 30575 1726867629.67105: worker is 1 (out of 1 available) 30575 1726867629.67115: exiting _queue_task() for managed_node3/debug 30575 1726867629.67126: done queuing things up, now waiting for results queue to drain 30575 1726867629.67127: waiting for pending results... 
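Editor's note: the entries above queue the `fedora.linux_system_roles.network : Print network provider` debug task from `roles/network/tasks/main.yml:7`, and the result further below renders "Using network provider: nm" from the `network_provider` variable set earlier via set_fact. The task file itself is not part of this log; based only on the task name and the rendered message, it plausibly looks like the following (a hedged reconstruction, not a verbatim copy of the role):

```yaml
# Hypothetical reconstruction of the task at roles/network/tasks/main.yml:7.
# network_provider was populated by an earlier set_fact (visible in the log).
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
```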
30575 1726867629.67258: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867629.67343: in run() - task 0affcac9-a3a5-e081-a588-000000001460 30575 1726867629.67372: variable 'ansible_search_path' from source: unknown 30575 1726867629.67387: variable 'ansible_search_path' from source: unknown 30575 1726867629.67431: calling self._execute() 30575 1726867629.67537: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867629.67551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867629.67684: variable 'omit' from source: magic vars 30575 1726867629.67960: variable 'ansible_distribution_major_version' from source: facts 30575 1726867629.67981: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867629.67995: variable 'omit' from source: magic vars 30575 1726867629.68069: variable 'omit' from source: magic vars 30575 1726867629.68183: variable 'network_provider' from source: set_fact 30575 1726867629.68207: variable 'omit' from source: magic vars 30575 1726867629.68284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867629.68302: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867629.68333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867629.68357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867629.68413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867629.68416: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867629.68423: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867629.68440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867629.68551: Set connection var ansible_pipelining to False 30575 1726867629.68561: Set connection var ansible_shell_type to sh 30575 1726867629.68572: Set connection var ansible_shell_executable to /bin/sh 30575 1726867629.68657: Set connection var ansible_timeout to 10 30575 1726867629.68660: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867629.68663: Set connection var ansible_connection to ssh 30575 1726867629.68665: variable 'ansible_shell_executable' from source: unknown 30575 1726867629.68668: variable 'ansible_connection' from source: unknown 30575 1726867629.68670: variable 'ansible_module_compression' from source: unknown 30575 1726867629.68672: variable 'ansible_shell_type' from source: unknown 30575 1726867629.68674: variable 'ansible_shell_executable' from source: unknown 30575 1726867629.68679: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867629.68682: variable 'ansible_pipelining' from source: unknown 30575 1726867629.68684: variable 'ansible_timeout' from source: unknown 30575 1726867629.68686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867629.68822: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867629.68842: variable 'omit' from source: magic vars 30575 1726867629.68852: starting attempt loop 30575 1726867629.68860: running the handler 30575 1726867629.68917: handler run complete 30575 1726867629.68940: attempt loop complete, returning result 30575 1726867629.68948: _execute() done 30575 1726867629.68984: dumping result to json 30575 1726867629.68987: done dumping result, returning 
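Editor's note: the next entries show the task "Abort applying the network state configuration if using the `network_state` variable with the initscripts provider" being skipped with `false_condition: "network_state != {}"` — `network_state` comes from the role defaults and is empty on this run, so the guard never fires. The exact task body is not in this log; a hedged sketch of how such a guard is typically written (the second `when` clause is an assumption inferred from the task name):

```yaml
# Hypothetical sketch of the guarded fail task at roles/network/tasks/main.yml:11.
# When network_state is the empty default ({}), the conditional is False and
# the task is skipped, matching the "skipping: [managed_node3]" result below.
- name: Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  fail:
    msg: Applying network_state is not supported with the initscripts provider
  when:
    - network_state != {}
    - network_provider == "initscripts"
```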
30575 1726867629.68990: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-e081-a588-000000001460] 30575 1726867629.68992: sending task result for task 0affcac9-a3a5-e081-a588-000000001460 ok: [managed_node3] => {} MSG: Using network provider: nm 30575 1726867629.69157: no more pending results, returning what we have 30575 1726867629.69161: results queue empty 30575 1726867629.69162: checking for any_errors_fatal 30575 1726867629.69171: done checking for any_errors_fatal 30575 1726867629.69172: checking for max_fail_percentage 30575 1726867629.69174: done checking for max_fail_percentage 30575 1726867629.69175: checking to see if all hosts have failed and the running result is not ok 30575 1726867629.69176: done checking to see if all hosts have failed 30575 1726867629.69179: getting the remaining hosts for this loop 30575 1726867629.69180: done getting the remaining hosts for this loop 30575 1726867629.69184: getting the next task for host managed_node3 30575 1726867629.69308: done getting next task for host managed_node3 30575 1726867629.69313: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867629.69319: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867629.69331: getting variables 30575 1726867629.69333: in VariableManager get_vars() 30575 1726867629.69374: Calling all_inventory to load vars for managed_node3 30575 1726867629.69418: Calling groups_inventory to load vars for managed_node3 30575 1726867629.69422: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867629.69429: done sending task result for task 0affcac9-a3a5-e081-a588-000000001460 30575 1726867629.69432: WORKER PROCESS EXITING 30575 1726867629.69442: Calling all_plugins_play to load vars for managed_node3 30575 1726867629.69445: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867629.69449: Calling groups_plugins_play to load vars for managed_node3 30575 1726867629.70918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867629.72623: done with get_vars() 30575 1726867629.72645: done getting variables 30575 1726867629.72703: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:27:09 -0400 (0:00:00.061) 0:01:05.104 ****** 30575 1726867629.72746: entering _queue_task() for managed_node3/fail 30575 1726867629.73045: worker is 1 (out of 1 available) 30575 1726867629.73281: exiting _queue_task() for managed_node3/fail 30575 1726867629.73293: done queuing things up, now waiting for results queue to drain 30575 1726867629.73295: waiting for pending results... 30575 1726867629.73358: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867629.73487: in run() - task 0affcac9-a3a5-e081-a588-000000001461 30575 1726867629.73507: variable 'ansible_search_path' from source: unknown 30575 1726867629.73523: variable 'ansible_search_path' from source: unknown 30575 1726867629.73562: calling self._execute() 30575 1726867629.73665: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867629.73679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867629.73694: variable 'omit' from source: magic vars 30575 1726867629.74092: variable 'ansible_distribution_major_version' from source: facts 30575 1726867629.74110: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867629.74285: variable 'network_state' from source: role '' defaults 30575 1726867629.74290: Evaluated conditional (network_state != {}): False 30575 1726867629.74293: when evaluation is False, skipping this task 30575 1726867629.74295: _execute() done 30575 1726867629.74297: dumping result to json 30575 1726867629.74299: done dumping result, returning 30575 1726867629.74301: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-e081-a588-000000001461] 30575 1726867629.74304: sending task result for task 0affcac9-a3a5-e081-a588-000000001461 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867629.74551: no more pending results, returning what we have 30575 1726867629.74556: results queue empty 30575 1726867629.74557: checking for any_errors_fatal 30575 1726867629.74564: done checking for any_errors_fatal 30575 1726867629.74565: checking for max_fail_percentage 30575 1726867629.74567: done checking for max_fail_percentage 30575 1726867629.74568: checking to see if all hosts have failed and the running result is not ok 30575 1726867629.74569: done checking to see if all hosts have failed 30575 1726867629.74570: getting the remaining hosts for this loop 30575 1726867629.74571: done getting the remaining hosts for this loop 30575 1726867629.74575: getting the next task for host managed_node3 30575 1726867629.74587: done getting next task for host managed_node3 30575 1726867629.74591: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867629.74597: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867629.74625: getting variables 30575 1726867629.74627: in VariableManager get_vars() 30575 1726867629.74672: Calling all_inventory to load vars for managed_node3 30575 1726867629.74676: Calling groups_inventory to load vars for managed_node3 30575 1726867629.74785: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867629.74796: done sending task result for task 0affcac9-a3a5-e081-a588-000000001461 30575 1726867629.74799: WORKER PROCESS EXITING 30575 1726867629.74808: Calling all_plugins_play to load vars for managed_node3 30575 1726867629.74812: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867629.74815: Calling groups_plugins_play to load vars for managed_node3 30575 1726867629.76400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867629.77937: done with get_vars() 30575 1726867629.77959: done getting variables 30575 1726867629.78020: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:27:09 -0400 (0:00:00.053) 0:01:05.158 ****** 30575 1726867629.78060: entering _queue_task() for managed_node3/fail 30575 1726867629.78582: worker is 1 (out of 1 available) 30575 1726867629.78592: exiting _queue_task() for managed_node3/fail 30575 1726867629.78602: done queuing things up, now waiting for results queue to drain 30575 1726867629.78603: waiting for pending results... 30575 1726867629.78684: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867629.78827: in run() - task 0affcac9-a3a5-e081-a588-000000001462 30575 1726867629.78939: variable 'ansible_search_path' from source: unknown 30575 1726867629.78943: variable 'ansible_search_path' from source: unknown 30575 1726867629.78947: calling self._execute() 30575 1726867629.79000: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867629.79012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867629.79025: variable 'omit' from source: magic vars 30575 1726867629.79414: variable 'ansible_distribution_major_version' from source: facts 30575 1726867629.79430: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867629.79556: variable 'network_state' from source: role '' defaults 30575 1726867629.79592: Evaluated conditional (network_state != {}): False 30575 1726867629.79600: when evaluation is False, skipping this task 30575 1726867629.79603: _execute() done 30575 1726867629.79605: dumping result to json 30575 1726867629.79607: done dumping result, returning 30575 1726867629.79683: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [0affcac9-a3a5-e081-a588-000000001462] 30575 1726867629.79686: sending task result for task 0affcac9-a3a5-e081-a588-000000001462 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867629.79831: no more pending results, returning what we have 30575 1726867629.79836: results queue empty 30575 1726867629.79837: checking for any_errors_fatal 30575 1726867629.79848: done checking for any_errors_fatal 30575 1726867629.79848: checking for max_fail_percentage 30575 1726867629.79850: done checking for max_fail_percentage 30575 1726867629.79851: checking to see if all hosts have failed and the running result is not ok 30575 1726867629.79852: done checking to see if all hosts have failed 30575 1726867629.79853: getting the remaining hosts for this loop 30575 1726867629.79855: done getting the remaining hosts for this loop 30575 1726867629.79859: getting the next task for host managed_node3 30575 1726867629.79869: done getting next task for host managed_node3 30575 1726867629.79873: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867629.79880: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867629.79906: getting variables 30575 1726867629.79907: in VariableManager get_vars() 30575 1726867629.79947: Calling all_inventory to load vars for managed_node3 30575 1726867629.79949: Calling groups_inventory to load vars for managed_node3 30575 1726867629.79952: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867629.79964: Calling all_plugins_play to load vars for managed_node3 30575 1726867629.79967: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867629.79969: Calling groups_plugins_play to load vars for managed_node3 30575 1726867629.80493: done sending task result for task 0affcac9-a3a5-e081-a588-000000001462 30575 1726867629.80496: WORKER PROCESS EXITING 30575 1726867629.81481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867629.83249: done with get_vars() 30575 1726867629.83274: done getting variables 30575 1726867629.83333: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:27:09 -0400 (0:00:00.053) 0:01:05.211 ****** 30575 1726867629.83372: entering _queue_task() for managed_node3/fail 30575 1726867629.83670: worker is 1 (out of 1 available) 30575 1726867629.83798: exiting _queue_task() for managed_node3/fail 30575 1726867629.83808: done queuing things up, now waiting for results queue to drain 30575 1726867629.83810: waiting for pending results... 30575 1726867629.84012: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867629.84165: in run() - task 0affcac9-a3a5-e081-a588-000000001463 30575 1726867629.84188: variable 'ansible_search_path' from source: unknown 30575 1726867629.84197: variable 'ansible_search_path' from source: unknown 30575 1726867629.84240: calling self._execute() 30575 1726867629.84373: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867629.84387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867629.84402: variable 'omit' from source: magic vars 30575 1726867629.84961: variable 'ansible_distribution_major_version' from source: facts 30575 1726867629.84982: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867629.85176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867629.88065: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867629.88127: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867629.88211: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 
1726867629.88235: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867629.88265: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867629.88360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867629.88429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867629.88437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867629.88486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867629.88510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867629.88647: variable 'ansible_distribution_major_version' from source: facts 30575 1726867629.88651: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30575 1726867629.88776: variable 'ansible_distribution' from source: facts 30575 1726867629.88790: variable '__network_rh_distros' from source: role '' defaults 30575 1726867629.88803: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30575 1726867629.89152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867629.89155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867629.89158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867629.89183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867629.89209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867629.89266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867629.89303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867629.89334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867629.89386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867629.89409: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867629.89452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867629.89486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867629.89522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867629.89584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867629.89587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867629.89954: variable 'network_connections' from source: include params 30575 1726867629.90018: variable 'interface' from source: play vars 30575 1726867629.90039: variable 'interface' from source: play vars 30575 1726867629.90055: variable 'network_state' from source: role '' defaults 30575 1726867629.90135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867629.90317: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867629.90364: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 
1726867629.90406: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867629.90438: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867629.90491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867629.90561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867629.90572: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867629.90590: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867629.90625: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30575 1726867629.90684: when evaluation is False, skipping this task 30575 1726867629.90686: _execute() done 30575 1726867629.90689: dumping result to json 30575 1726867629.90691: done dumping result, returning 30575 1726867629.90693: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-e081-a588-000000001463] 30575 1726867629.90695: sending task result for task 0affcac9-a3a5-e081-a588-000000001463 skipping: [managed_node3] => { "changed": 
false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30575 1726867629.90897: no more pending results, returning what we have 30575 1726867629.90901: results queue empty 30575 1726867629.90902: checking for any_errors_fatal 30575 1726867629.90907: done checking for any_errors_fatal 30575 1726867629.90908: checking for max_fail_percentage 30575 1726867629.90910: done checking for max_fail_percentage 30575 1726867629.90911: checking to see if all hosts have failed and the running result is not ok 30575 1726867629.90912: done checking to see if all hosts have failed 30575 1726867629.90913: getting the remaining hosts for this loop 30575 1726867629.90914: done getting the remaining hosts for this loop 30575 1726867629.90918: getting the next task for host managed_node3 30575 1726867629.90933: done getting next task for host managed_node3 30575 1726867629.90938: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867629.90943: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867629.91195: getting variables 30575 1726867629.91197: in VariableManager get_vars() 30575 1726867629.91232: Calling all_inventory to load vars for managed_node3 30575 1726867629.91235: Calling groups_inventory to load vars for managed_node3 30575 1726867629.91237: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867629.91245: Calling all_plugins_play to load vars for managed_node3 30575 1726867629.91248: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867629.91251: Calling groups_plugins_play to load vars for managed_node3 30575 1726867629.91982: done sending task result for task 0affcac9-a3a5-e081-a588-000000001463 30575 1726867629.91985: WORKER PROCESS EXITING 30575 1726867629.92893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867629.95583: done with get_vars() 30575 1726867629.95610: done getting variables 30575 1726867629.95731: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due 
to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:27:09 -0400 (0:00:00.124) 0:01:05.335 ****** 30575 1726867629.95799: entering _queue_task() for managed_node3/dnf 30575 1726867629.96150: worker is 1 (out of 1 available) 30575 1726867629.96162: exiting _queue_task() for managed_node3/dnf 30575 1726867629.96175: done queuing things up, now waiting for results queue to drain 30575 1726867629.96176: waiting for pending results... 30575 1726867629.96483: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867629.96635: in run() - task 0affcac9-a3a5-e081-a588-000000001464 30575 1726867629.96662: variable 'ansible_search_path' from source: unknown 30575 1726867629.96669: variable 'ansible_search_path' from source: unknown 30575 1726867629.96714: calling self._execute() 30575 1726867629.96814: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867629.96826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867629.96843: variable 'omit' from source: magic vars 30575 1726867629.97233: variable 'ansible_distribution_major_version' from source: facts 30575 1726867629.97302: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867629.97458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867630.02761: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867630.02852: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867630.03017: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867630.03056: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867630.03109: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867630.03262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.03331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.03435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.03518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.03539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.03867: variable 'ansible_distribution' from source: facts 30575 1726867630.03882: variable 'ansible_distribution_major_version' from source: facts 30575 1726867630.03901: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30575 1726867630.04126: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867630.04363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.04425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.04458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.04561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.04783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.04786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.04789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.04791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.04885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.04973: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.05019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.05092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.05123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.05216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.05300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.05567: variable 'network_connections' from source: include params 30575 1726867630.05826: variable 'interface' from source: play vars 30575 1726867630.05830: variable 'interface' from source: play vars 30575 1726867630.05926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867630.06340: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867630.06385: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867630.06684: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867630.06687: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867630.07084: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867630.07087: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867630.07097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.07099: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867630.07101: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867630.07658: variable 'network_connections' from source: include params 30575 1726867630.07743: variable 'interface' from source: play vars 30575 1726867630.07813: variable 'interface' from source: play vars 30575 1726867630.07872: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867630.07924: when evaluation is False, skipping this task 30575 1726867630.07962: _execute() done 30575 1726867630.07972: dumping result to json 30575 1726867630.07984: done dumping result, returning 30575 1726867630.08073: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001464] 30575 
1726867630.08085: sending task result for task 0affcac9-a3a5-e081-a588-000000001464 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867630.08239: no more pending results, returning what we have 30575 1726867630.08243: results queue empty 30575 1726867630.08244: checking for any_errors_fatal 30575 1726867630.08253: done checking for any_errors_fatal 30575 1726867630.08253: checking for max_fail_percentage 30575 1726867630.08255: done checking for max_fail_percentage 30575 1726867630.08256: checking to see if all hosts have failed and the running result is not ok 30575 1726867630.08257: done checking to see if all hosts have failed 30575 1726867630.08258: getting the remaining hosts for this loop 30575 1726867630.08260: done getting the remaining hosts for this loop 30575 1726867630.08263: getting the next task for host managed_node3 30575 1726867630.08274: done getting next task for host managed_node3 30575 1726867630.08281: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867630.08286: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867630.08309: getting variables 30575 1726867630.08311: in VariableManager get_vars() 30575 1726867630.08353: Calling all_inventory to load vars for managed_node3 30575 1726867630.08355: Calling groups_inventory to load vars for managed_node3 30575 1726867630.08358: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867630.08369: Calling all_plugins_play to load vars for managed_node3 30575 1726867630.08372: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867630.08374: Calling groups_plugins_play to load vars for managed_node3 30575 1726867630.09564: done sending task result for task 0affcac9-a3a5-e081-a588-000000001464 30575 1726867630.09567: WORKER PROCESS EXITING 30575 1726867630.11570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867630.15278: done with get_vars() 30575 1726867630.15301: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867630.15485: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:27:10 -0400 (0:00:00.197) 0:01:05.532 ****** 30575 1726867630.15520: entering _queue_task() for managed_node3/yum 30575 1726867630.16172: worker is 1 (out of 1 available) 30575 1726867630.16186: exiting _queue_task() for managed_node3/yum 30575 1726867630.16201: done queuing things up, now waiting for results queue to drain 30575 1726867630.16203: waiting for pending results... 30575 1726867630.16686: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867630.17183: in run() - task 0affcac9-a3a5-e081-a588-000000001465 30575 1726867630.17187: variable 'ansible_search_path' from source: unknown 30575 1726867630.17190: variable 'ansible_search_path' from source: unknown 30575 1726867630.17193: calling self._execute() 30575 1726867630.17195: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867630.17198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867630.17200: variable 'omit' from source: magic vars 30575 1726867630.17941: variable 'ansible_distribution_major_version' from source: facts 30575 1726867630.18384: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867630.18388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867630.22993: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867630.23079: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867630.23483: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867630.23486: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867630.23489: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867630.23492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.23689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.23724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.23768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.23790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.24282: variable 'ansible_distribution_major_version' from source: facts 30575 1726867630.24285: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30575 1726867630.24287: when evaluation is False, skipping this task 30575 1726867630.24289: _execute() done 30575 1726867630.24291: dumping result to json 30575 1726867630.24293: done dumping result, 
returning 30575 1726867630.24295: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001465] 30575 1726867630.24297: sending task result for task 0affcac9-a3a5-e081-a588-000000001465 30575 1726867630.24374: done sending task result for task 0affcac9-a3a5-e081-a588-000000001465 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30575 1726867630.24431: no more pending results, returning what we have 30575 1726867630.24435: results queue empty 30575 1726867630.24436: checking for any_errors_fatal 30575 1726867630.24444: done checking for any_errors_fatal 30575 1726867630.24444: checking for max_fail_percentage 30575 1726867630.24446: done checking for max_fail_percentage 30575 1726867630.24447: checking to see if all hosts have failed and the running result is not ok 30575 1726867630.24448: done checking to see if all hosts have failed 30575 1726867630.24449: getting the remaining hosts for this loop 30575 1726867630.24451: done getting the remaining hosts for this loop 30575 1726867630.24455: getting the next task for host managed_node3 30575 1726867630.24465: done getting next task for host managed_node3 30575 1726867630.24469: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867630.24475: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867630.24504: getting variables 30575 1726867630.24505: in VariableManager get_vars() 30575 1726867630.24544: Calling all_inventory to load vars for managed_node3 30575 1726867630.24547: Calling groups_inventory to load vars for managed_node3 30575 1726867630.24549: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867630.24560: Calling all_plugins_play to load vars for managed_node3 30575 1726867630.24563: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867630.24566: Calling groups_plugins_play to load vars for managed_node3 30575 1726867630.25185: WORKER PROCESS EXITING 30575 1726867630.27107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867630.28688: done with get_vars() 30575 1726867630.28719: done getting variables 30575 1726867630.28794: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : 
Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:27:10 -0400 (0:00:00.133) 0:01:05.665 ****** 30575 1726867630.28842: entering _queue_task() for managed_node3/fail 30575 1726867630.29252: worker is 1 (out of 1 available) 30575 1726867630.29265: exiting _queue_task() for managed_node3/fail 30575 1726867630.29282: done queuing things up, now waiting for results queue to drain 30575 1726867630.29284: waiting for pending results... 30575 1726867630.29603: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867630.29770: in run() - task 0affcac9-a3a5-e081-a588-000000001466 30575 1726867630.29793: variable 'ansible_search_path' from source: unknown 30575 1726867630.29802: variable 'ansible_search_path' from source: unknown 30575 1726867630.29851: calling self._execute() 30575 1726867630.29952: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867630.29964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867630.29979: variable 'omit' from source: magic vars 30575 1726867630.30386: variable 'ansible_distribution_major_version' from source: facts 30575 1726867630.30402: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867630.30531: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867630.30731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867630.33021: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867630.33126: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867630.33217: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867630.33232: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867630.33263: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867630.33361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.33398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.33484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.33494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.33520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.33586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.33616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.33654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.33760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.33764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.33798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.33833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.33872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.33921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.33978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.34142: variable 'network_connections' from source: include params 
30575 1726867630.34158: variable 'interface' from source: play vars 30575 1726867630.34241: variable 'interface' from source: play vars 30575 1726867630.34328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867630.34504: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867630.34583: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867630.34592: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867630.34632: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867630.34681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867630.35104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867630.35176: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.35187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867630.35243: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867630.35510: variable 'network_connections' from source: include params 30575 1726867630.35523: variable 'interface' from source: play vars 30575 1726867630.35589: variable 'interface' from source: play vars 30575 1726867630.35722: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867630.35726: when evaluation is False, skipping this task 30575 1726867630.35728: _execute() done 30575 1726867630.35730: dumping result to json 30575 1726867630.35732: done dumping result, returning 30575 1726867630.35734: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001466] 30575 1726867630.35736: sending task result for task 0affcac9-a3a5-e081-a588-000000001466 30575 1726867630.35818: done sending task result for task 0affcac9-a3a5-e081-a588-000000001466 30575 1726867630.35821: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867630.36182: no more pending results, returning what we have 30575 1726867630.36186: results queue empty 30575 1726867630.36187: checking for any_errors_fatal 30575 1726867630.36193: done checking for any_errors_fatal 30575 1726867630.36194: checking for max_fail_percentage 30575 1726867630.36196: done checking for max_fail_percentage 30575 1726867630.36197: checking to see if all hosts have failed and the running result is not ok 30575 1726867630.36198: done checking to see if all hosts have failed 30575 1726867630.36198: getting the remaining hosts for this loop 30575 1726867630.36199: done getting the remaining hosts for this loop 30575 1726867630.36203: getting the next task for host managed_node3 30575 1726867630.36213: done getting next task for host managed_node3 30575 1726867630.36218: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30575 1726867630.36223: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867630.36242: getting variables 30575 1726867630.36243: in VariableManager get_vars() 30575 1726867630.36282: Calling all_inventory to load vars for managed_node3 30575 1726867630.36285: Calling groups_inventory to load vars for managed_node3 30575 1726867630.36287: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867630.36296: Calling all_plugins_play to load vars for managed_node3 30575 1726867630.36299: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867630.36301: Calling groups_plugins_play to load vars for managed_node3 30575 1726867630.44530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867630.46105: done with get_vars() 30575 1726867630.46134: done getting variables 30575 1726867630.46194: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:27:10 -0400 (0:00:00.173) 0:01:05.839 ****** 30575 1726867630.46226: entering _queue_task() for managed_node3/package 30575 1726867630.46733: worker is 1 (out of 1 available) 30575 1726867630.46744: exiting _queue_task() for managed_node3/package 30575 1726867630.46754: done queuing things up, now waiting for results queue to drain 30575 1726867630.46756: waiting for pending results... 
30575 1726867630.47013: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30575 1726867630.47201: in run() - task 0affcac9-a3a5-e081-a588-000000001467 30575 1726867630.47204: variable 'ansible_search_path' from source: unknown 30575 1726867630.47208: variable 'ansible_search_path' from source: unknown 30575 1726867630.47266: calling self._execute() 30575 1726867630.47350: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867630.47376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867630.47487: variable 'omit' from source: magic vars 30575 1726867630.47800: variable 'ansible_distribution_major_version' from source: facts 30575 1726867630.47825: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867630.48018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867630.48306: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867630.48364: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867630.48433: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867630.48483: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867630.48605: variable 'network_packages' from source: role '' defaults 30575 1726867630.48727: variable '__network_provider_setup' from source: role '' defaults 30575 1726867630.48745: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867630.48882: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867630.48886: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867630.48890: variable 
'__network_packages_default_nm' from source: role '' defaults 30575 1726867630.49085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867630.51106: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867630.51167: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867630.51273: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867630.51308: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867630.51353: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867630.51436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.51464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.51584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.51588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.51591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 
1726867630.51611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.51626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.51650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.51694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.51713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.51959: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867630.52067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.52093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.52121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.52274: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.52279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.52282: variable 'ansible_python' from source: facts 30575 1726867630.52284: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867630.52365: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867630.52443: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867630.52570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.52600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.52626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.52663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.52681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.52834: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.52846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.52849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.52852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.52854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.52980: variable 'network_connections' from source: include params 30575 1726867630.52992: variable 'interface' from source: play vars 30575 1726867630.53097: variable 'interface' from source: play vars 30575 1726867630.53171: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867630.53200: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867630.53237: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.53271: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867630.53381: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867630.53609: variable 'network_connections' from source: include params 30575 1726867630.53620: variable 'interface' from source: play vars 30575 1726867630.53719: variable 'interface' from source: play vars 30575 1726867630.53749: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867630.53835: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867630.54133: variable 'network_connections' from source: include params 30575 1726867630.54140: variable 'interface' from source: play vars 30575 1726867630.54195: variable 'interface' from source: play vars 30575 1726867630.54221: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867630.54357: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867630.54587: variable 'network_connections' from source: include params 30575 1726867630.54590: variable 'interface' from source: play vars 30575 1726867630.54653: variable 'interface' from source: play vars 30575 1726867630.54704: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867630.54763: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867630.54783: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867630.54831: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867630.55140: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867630.55558: variable 'network_connections' from source: include params 30575 1726867630.55561: variable 'interface' from 
source: play vars 30575 1726867630.55623: variable 'interface' from source: play vars 30575 1726867630.55636: variable 'ansible_distribution' from source: facts 30575 1726867630.55639: variable '__network_rh_distros' from source: role '' defaults 30575 1726867630.55645: variable 'ansible_distribution_major_version' from source: facts 30575 1726867630.55664: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867630.55825: variable 'ansible_distribution' from source: facts 30575 1726867630.55829: variable '__network_rh_distros' from source: role '' defaults 30575 1726867630.55834: variable 'ansible_distribution_major_version' from source: facts 30575 1726867630.55853: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867630.56015: variable 'ansible_distribution' from source: facts 30575 1726867630.56051: variable '__network_rh_distros' from source: role '' defaults 30575 1726867630.56055: variable 'ansible_distribution_major_version' from source: facts 30575 1726867630.56060: variable 'network_provider' from source: set_fact 30575 1726867630.56076: variable 'ansible_facts' from source: unknown 30575 1726867630.56862: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30575 1726867630.56865: when evaluation is False, skipping this task 30575 1726867630.56867: _execute() done 30575 1726867630.56870: dumping result to json 30575 1726867630.56871: done dumping result, returning 30575 1726867630.56875: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-e081-a588-000000001467] 30575 1726867630.56878: sending task result for task 0affcac9-a3a5-e081-a588-000000001467 30575 1726867630.56941: done sending task result for task 0affcac9-a3a5-e081-a588-000000001467 30575 1726867630.56944: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, 
"false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30575 1726867630.56993: no more pending results, returning what we have 30575 1726867630.56996: results queue empty 30575 1726867630.56997: checking for any_errors_fatal 30575 1726867630.57006: done checking for any_errors_fatal 30575 1726867630.57006: checking for max_fail_percentage 30575 1726867630.57008: done checking for max_fail_percentage 30575 1726867630.57009: checking to see if all hosts have failed and the running result is not ok 30575 1726867630.57010: done checking to see if all hosts have failed 30575 1726867630.57010: getting the remaining hosts for this loop 30575 1726867630.57012: done getting the remaining hosts for this loop 30575 1726867630.57016: getting the next task for host managed_node3 30575 1726867630.57024: done getting next task for host managed_node3 30575 1726867630.57028: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867630.57033: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867630.57054: getting variables 30575 1726867630.57055: in VariableManager get_vars() 30575 1726867630.57101: Calling all_inventory to load vars for managed_node3 30575 1726867630.57103: Calling groups_inventory to load vars for managed_node3 30575 1726867630.57106: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867630.57115: Calling all_plugins_play to load vars for managed_node3 30575 1726867630.57117: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867630.57120: Calling groups_plugins_play to load vars for managed_node3 30575 1726867630.58566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867630.60172: done with get_vars() 30575 1726867630.60196: done getting variables 30575 1726867630.60256: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:27:10 -0400 (0:00:00.140) 0:01:05.980 ****** 30575 1726867630.60293: entering _queue_task() for managed_node3/package 30575 1726867630.60610: worker is 1 (out of 1 available) 30575 1726867630.60624: exiting _queue_task() for managed_node3/package 30575 1726867630.60637: done queuing things up, now waiting for results queue to drain 30575 
1726867630.60639: waiting for pending results... 30575 1726867630.61095: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867630.61100: in run() - task 0affcac9-a3a5-e081-a588-000000001468 30575 1726867630.61116: variable 'ansible_search_path' from source: unknown 30575 1726867630.61124: variable 'ansible_search_path' from source: unknown 30575 1726867630.61163: calling self._execute() 30575 1726867630.61268: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867630.61283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867630.61298: variable 'omit' from source: magic vars 30575 1726867630.61683: variable 'ansible_distribution_major_version' from source: facts 30575 1726867630.61760: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867630.61831: variable 'network_state' from source: role '' defaults 30575 1726867630.61847: Evaluated conditional (network_state != {}): False 30575 1726867630.61854: when evaluation is False, skipping this task 30575 1726867630.61861: _execute() done 30575 1726867630.61872: dumping result to json 30575 1726867630.61881: done dumping result, returning 30575 1726867630.62085: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000001468] 30575 1726867630.62089: sending task result for task 0affcac9-a3a5-e081-a588-000000001468 30575 1726867630.62165: done sending task result for task 0affcac9-a3a5-e081-a588-000000001468 30575 1726867630.62168: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867630.62215: no more pending results, returning what we have 30575 1726867630.62219: 
results queue empty 30575 1726867630.62219: checking for any_errors_fatal 30575 1726867630.62224: done checking for any_errors_fatal 30575 1726867630.62225: checking for max_fail_percentage 30575 1726867630.62227: done checking for max_fail_percentage 30575 1726867630.62228: checking to see if all hosts have failed and the running result is not ok 30575 1726867630.62229: done checking to see if all hosts have failed 30575 1726867630.62230: getting the remaining hosts for this loop 30575 1726867630.62231: done getting the remaining hosts for this loop 30575 1726867630.62235: getting the next task for host managed_node3 30575 1726867630.62243: done getting next task for host managed_node3 30575 1726867630.62247: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867630.62252: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867630.62272: getting variables 30575 1726867630.62274: in VariableManager get_vars() 30575 1726867630.62316: Calling all_inventory to load vars for managed_node3 30575 1726867630.62319: Calling groups_inventory to load vars for managed_node3 30575 1726867630.62321: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867630.62332: Calling all_plugins_play to load vars for managed_node3 30575 1726867630.62335: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867630.62338: Calling groups_plugins_play to load vars for managed_node3 30575 1726867630.63875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867630.65396: done with get_vars() 30575 1726867630.65411: done getting variables 30575 1726867630.65457: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:27:10 -0400 (0:00:00.051) 0:01:06.032 ****** 30575 1726867630.65484: entering _queue_task() for managed_node3/package 30575 1726867630.65721: worker is 1 (out of 1 available) 30575 1726867630.65734: exiting _queue_task() for managed_node3/package 30575 1726867630.65748: done queuing things up, now waiting for results queue to drain 30575 1726867630.65749: waiting for pending results... 
30575 1726867630.65939: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867630.66043: in run() - task 0affcac9-a3a5-e081-a588-000000001469 30575 1726867630.66055: variable 'ansible_search_path' from source: unknown 30575 1726867630.66059: variable 'ansible_search_path' from source: unknown 30575 1726867630.66092: calling self._execute() 30575 1726867630.66171: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867630.66175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867630.66188: variable 'omit' from source: magic vars 30575 1726867630.66459: variable 'ansible_distribution_major_version' from source: facts 30575 1726867630.66468: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867630.66556: variable 'network_state' from source: role '' defaults 30575 1726867630.66564: Evaluated conditional (network_state != {}): False 30575 1726867630.66568: when evaluation is False, skipping this task 30575 1726867630.66571: _execute() done 30575 1726867630.66574: dumping result to json 30575 1726867630.66576: done dumping result, returning 30575 1726867630.66585: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000001469] 30575 1726867630.66591: sending task result for task 0affcac9-a3a5-e081-a588-000000001469 30575 1726867630.66690: done sending task result for task 0affcac9-a3a5-e081-a588-000000001469 30575 1726867630.66692: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867630.66742: no more pending results, returning what we have 30575 1726867630.66745: results queue empty 30575 1726867630.66746: checking for 
any_errors_fatal 30575 1726867630.66752: done checking for any_errors_fatal 30575 1726867630.66753: checking for max_fail_percentage 30575 1726867630.66754: done checking for max_fail_percentage 30575 1726867630.66755: checking to see if all hosts have failed and the running result is not ok 30575 1726867630.66756: done checking to see if all hosts have failed 30575 1726867630.66757: getting the remaining hosts for this loop 30575 1726867630.66758: done getting the remaining hosts for this loop 30575 1726867630.66761: getting the next task for host managed_node3 30575 1726867630.66769: done getting next task for host managed_node3 30575 1726867630.66773: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867630.66783: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867630.66805: getting variables 30575 1726867630.66806: in VariableManager get_vars() 30575 1726867630.66842: Calling all_inventory to load vars for managed_node3 30575 1726867630.66844: Calling groups_inventory to load vars for managed_node3 30575 1726867630.66846: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867630.66854: Calling all_plugins_play to load vars for managed_node3 30575 1726867630.66857: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867630.66859: Calling groups_plugins_play to load vars for managed_node3 30575 1726867630.68647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867630.69528: done with get_vars() 30575 1726867630.69543: done getting variables 30575 1726867630.69587: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:27:10 -0400 (0:00:00.041) 0:01:06.073 ****** 30575 1726867630.69613: entering _queue_task() for managed_node3/service 30575 1726867630.69838: worker is 1 (out of 1 available) 30575 1726867630.69851: exiting _queue_task() for managed_node3/service 30575 1726867630.69864: done queuing things up, now waiting for results queue to drain 30575 1726867630.69865: waiting for pending results... 
30575 1726867630.70046: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867630.70142: in run() - task 0affcac9-a3a5-e081-a588-00000000146a 30575 1726867630.70155: variable 'ansible_search_path' from source: unknown 30575 1726867630.70159: variable 'ansible_search_path' from source: unknown 30575 1726867630.70189: calling self._execute() 30575 1726867630.70260: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867630.70264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867630.70271: variable 'omit' from source: magic vars 30575 1726867630.70984: variable 'ansible_distribution_major_version' from source: facts 30575 1726867630.70988: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867630.71076: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867630.71411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867630.74080: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867630.74161: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867630.74209: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867630.74250: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867630.74281: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867630.74522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30575 1726867630.74526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.74529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.74585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.75014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.75018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.75021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.75024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.75085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.75089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.75132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.75160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.75305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.75349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.75365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.75821: variable 'network_connections' from source: include params 30575 1726867630.75843: variable 'interface' from source: play vars 30575 1726867630.75917: variable 'interface' from source: play vars 30575 1726867630.76165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867630.76403: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867630.76481: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867630.76531: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867630.76565: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867630.76623: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867630.76702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867630.76705: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.76716: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867630.76770: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867630.77044: variable 'network_connections' from source: include params 30575 1726867630.77055: variable 'interface' from source: play vars 30575 1726867630.77123: variable 'interface' from source: play vars 30575 1726867630.77159: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867630.77167: when evaluation is False, skipping this task 30575 1726867630.77249: _execute() done 30575 1726867630.77253: dumping result to json 30575 1726867630.77255: done dumping result, returning 30575 1726867630.77258: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-00000000146a] 30575 1726867630.77260: sending task result for task 0affcac9-a3a5-e081-a588-00000000146a 30575 1726867630.77333: done sending task result for task 
0affcac9-a3a5-e081-a588-00000000146a 30575 1726867630.77343: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867630.77402: no more pending results, returning what we have 30575 1726867630.77406: results queue empty 30575 1726867630.77407: checking for any_errors_fatal 30575 1726867630.77416: done checking for any_errors_fatal 30575 1726867630.77417: checking for max_fail_percentage 30575 1726867630.77419: done checking for max_fail_percentage 30575 1726867630.77420: checking to see if all hosts have failed and the running result is not ok 30575 1726867630.77421: done checking to see if all hosts have failed 30575 1726867630.77422: getting the remaining hosts for this loop 30575 1726867630.77423: done getting the remaining hosts for this loop 30575 1726867630.77427: getting the next task for host managed_node3 30575 1726867630.77437: done getting next task for host managed_node3 30575 1726867630.77441: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867630.77446: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867630.77474: getting variables 30575 1726867630.77476: in VariableManager get_vars() 30575 1726867630.77524: Calling all_inventory to load vars for managed_node3 30575 1726867630.77527: Calling groups_inventory to load vars for managed_node3 30575 1726867630.77530: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867630.77540: Calling all_plugins_play to load vars for managed_node3 30575 1726867630.77544: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867630.77547: Calling groups_plugins_play to load vars for managed_node3 30575 1726867630.79495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867630.80959: done with get_vars() 30575 1726867630.80983: done getting variables 30575 1726867630.81043: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:27:10 -0400 (0:00:00.114) 0:01:06.188 ****** 30575 1726867630.81085: entering _queue_task() for managed_node3/service 30575 1726867630.81329: worker is 1 (out of 1 available) 30575 1726867630.81345: exiting _queue_task() for managed_node3/service 30575 1726867630.81359: done 
queuing things up, now waiting for results queue to drain 30575 1726867630.81360: waiting for pending results... 30575 1726867630.81556: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867630.81647: in run() - task 0affcac9-a3a5-e081-a588-00000000146b 30575 1726867630.81659: variable 'ansible_search_path' from source: unknown 30575 1726867630.81664: variable 'ansible_search_path' from source: unknown 30575 1726867630.81695: calling self._execute() 30575 1726867630.81784: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867630.81790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867630.81800: variable 'omit' from source: magic vars 30575 1726867630.82249: variable 'ansible_distribution_major_version' from source: facts 30575 1726867630.82482: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867630.82486: variable 'network_provider' from source: set_fact 30575 1726867630.82488: variable 'network_state' from source: role '' defaults 30575 1726867630.82490: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30575 1726867630.82493: variable 'omit' from source: magic vars 30575 1726867630.82546: variable 'omit' from source: magic vars 30575 1726867630.82575: variable 'network_service_name' from source: role '' defaults 30575 1726867630.82655: variable 'network_service_name' from source: role '' defaults 30575 1726867630.82776: variable '__network_provider_setup' from source: role '' defaults 30575 1726867630.82782: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867630.82831: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867630.82840: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867630.82890: variable '__network_packages_default_nm' from source: role '' 
defaults 30575 1726867630.83036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867630.84480: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867630.84535: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867630.84561: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867630.84591: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867630.84611: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867630.84669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.84704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.84727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.84753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.84764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.84861: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.84864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.84866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.84986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.84990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.85135: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867630.85243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.85266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.85293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.85412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.85417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.85505: variable 'ansible_python' from source: facts 30575 1726867630.85539: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867630.85619: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867630.85738: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867630.85848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.85888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.85918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.86095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.86099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.86102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867630.86153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867630.86156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.86158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867630.86161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867630.86255: variable 'network_connections' from source: include params 30575 1726867630.86260: variable 'interface' from source: play vars 30575 1726867630.86325: variable 'interface' from source: play vars 30575 1726867630.86395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867630.86528: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867630.86565: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867630.86597: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867630.86629: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867630.86674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867630.86697: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867630.86721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867630.86743: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867630.86783: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867630.86960: variable 'network_connections' from source: include params 30575 1726867630.86967: variable 'interface' from source: play vars 30575 1726867630.87023: variable 'interface' from source: play vars 30575 1726867630.87045: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867630.87101: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867630.87285: variable 'network_connections' from source: include params 30575 1726867630.87288: variable 'interface' from source: play vars 30575 1726867630.87342: variable 'interface' from source: play vars 30575 1726867630.87357: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867630.87415: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867630.87598: variable 'network_connections' from source: include params 30575 1726867630.87601: variable 'interface' from source: play vars 30575 1726867630.87654: variable 'interface' from source: play vars 30575 1726867630.87692: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30575 1726867630.87739: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867630.87744: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867630.87802: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867630.88183: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867630.88635: variable 'network_connections' from source: include params 30575 1726867630.88639: variable 'interface' from source: play vars 30575 1726867630.88740: variable 'interface' from source: play vars 30575 1726867630.88743: variable 'ansible_distribution' from source: facts 30575 1726867630.88746: variable '__network_rh_distros' from source: role '' defaults 30575 1726867630.88748: variable 'ansible_distribution_major_version' from source: facts 30575 1726867630.88750: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867630.88921: variable 'ansible_distribution' from source: facts 30575 1726867630.88925: variable '__network_rh_distros' from source: role '' defaults 30575 1726867630.88927: variable 'ansible_distribution_major_version' from source: facts 30575 1726867630.88929: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867630.89072: variable 'ansible_distribution' from source: facts 30575 1726867630.89075: variable '__network_rh_distros' from source: role '' defaults 30575 1726867630.89082: variable 'ansible_distribution_major_version' from source: facts 30575 1726867630.89180: variable 'network_provider' from source: set_fact 30575 1726867630.89183: variable 'omit' from source: magic vars 30575 1726867630.89186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867630.89188: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867630.89202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867630.89221: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867630.89231: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867630.89263: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867630.89266: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867630.89269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867630.89363: Set connection var ansible_pipelining to False 30575 1726867630.89367: Set connection var ansible_shell_type to sh 30575 1726867630.89381: Set connection var ansible_shell_executable to /bin/sh 30575 1726867630.89384: Set connection var ansible_timeout to 10 30575 1726867630.89484: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867630.89487: Set connection var ansible_connection to ssh 30575 1726867630.89490: variable 'ansible_shell_executable' from source: unknown 30575 1726867630.89492: variable 'ansible_connection' from source: unknown 30575 1726867630.89496: variable 'ansible_module_compression' from source: unknown 30575 1726867630.89498: variable 'ansible_shell_type' from source: unknown 30575 1726867630.89500: variable 'ansible_shell_executable' from source: unknown 30575 1726867630.89502: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867630.89504: variable 'ansible_pipelining' from source: unknown 30575 1726867630.89506: variable 'ansible_timeout' from source: unknown 30575 1726867630.89508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867630.89629: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867630.89636: variable 'omit' from source: magic vars 30575 1726867630.89639: starting attempt loop 30575 1726867630.89641: running the handler 30575 1726867630.89643: variable 'ansible_facts' from source: unknown 30575 1726867630.90381: _low_level_execute_command(): starting 30575 1726867630.90406: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867630.91194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867630.91240: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867630.91276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867630.91395: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30575 1726867630.93057: stdout chunk (state=3): >>>/root <<< 30575 1726867630.93220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867630.93224: stdout chunk (state=3): >>><<< 30575 1726867630.93226: stderr chunk (state=3): >>><<< 30575 1726867630.93248: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867630.93350: _low_level_execute_command(): starting 30575 1726867630.93354: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867630.932548-33699-46278778636055 `" && echo ansible-tmp-1726867630.932548-33699-46278778636055="` echo /root/.ansible/tmp/ansible-tmp-1726867630.932548-33699-46278778636055 `" ) 
&& sleep 0' 30575 1726867630.93828: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867630.93837: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867630.93878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867630.93884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867630.93886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867630.93889: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867630.93891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867630.93915: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867630.93927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867630.93972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867630.93975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867630.93983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867630.94029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867630.95951: stdout chunk (state=3): 
>>>ansible-tmp-1726867630.932548-33699-46278778636055=/root/.ansible/tmp/ansible-tmp-1726867630.932548-33699-46278778636055 <<< 30575 1726867630.96067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867630.96081: stderr chunk (state=3): >>><<< 30575 1726867630.96084: stdout chunk (state=3): >>><<< 30575 1726867630.96099: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867630.932548-33699-46278778636055=/root/.ansible/tmp/ansible-tmp-1726867630.932548-33699-46278778636055 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867630.96124: variable 'ansible_module_compression' from source: unknown 30575 1726867630.96160: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30575 1726867630.96215: variable 'ansible_facts' from 
source: unknown 30575 1726867630.96348: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867630.932548-33699-46278778636055/AnsiballZ_systemd.py 30575 1726867630.96444: Sending initial data 30575 1726867630.96447: Sent initial data (154 bytes) 30575 1726867630.96851: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867630.96855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867630.96863: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867630.96906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867630.96909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867630.96957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867630.98504: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867630.98511: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867630.98546: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867630.98591: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpirwyhpaj /root/.ansible/tmp/ansible-tmp-1726867630.932548-33699-46278778636055/AnsiballZ_systemd.py <<< 30575 1726867630.98598: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867630.932548-33699-46278778636055/AnsiballZ_systemd.py" <<< 30575 1726867630.98636: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpirwyhpaj" to remote "/root/.ansible/tmp/ansible-tmp-1726867630.932548-33699-46278778636055/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867630.932548-33699-46278778636055/AnsiballZ_systemd.py" <<< 30575 1726867630.99710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867630.99756: stderr chunk (state=3): >>><<< 30575 1726867630.99759: stdout chunk (state=3): >>><<< 30575 1726867630.99787: done transferring module to remote 30575 1726867630.99796: _low_level_execute_command(): starting 30575 1726867630.99801: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867630.932548-33699-46278778636055/ 
/root/.ansible/tmp/ansible-tmp-1726867630.932548-33699-46278778636055/AnsiballZ_systemd.py && sleep 0' 30575 1726867631.00235: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867631.00239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867631.00268: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867631.00271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867631.00274: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867631.00276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867631.00330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867631.00334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867631.00342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867631.00402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867631.02148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867631.02172: stderr chunk (state=3): >>><<< 30575 1726867631.02175: stdout chunk (state=3): >>><<< 30575 
1726867631.02189: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867631.02192: _low_level_execute_command(): starting 30575 1726867631.02198: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867630.932548-33699-46278778636055/AnsiballZ_systemd.py && sleep 0' 30575 1726867631.02616: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867631.02620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867631.02637: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867631.02641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867631.02683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867631.02704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867631.02749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867631.32287: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", 
"CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10563584", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3316097024", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1873748000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": 
"[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", 
"LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", 
"InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30575 1726867631.33835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867631.33842: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867631.33923: stderr chunk (state=3): >>><<< 30575 1726867631.33996: stdout chunk (state=3): >>><<< 30575 1726867631.34021: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10563584", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3316097024", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1873748000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867631.34418: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867630.932548-33699-46278778636055/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867631.34441: _low_level_execute_command(): starting 30575 1726867631.34444: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867630.932548-33699-46278778636055/ > /dev/null 2>&1 && sleep 0' 30575 1726867631.35883: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867631.35887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867631.35889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867631.35892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867631.35894: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867631.35896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867631.36485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867631.36488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867631.36503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867631.36574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867631.38642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867631.38651: stdout chunk (state=3): >>><<< 30575 1726867631.38660: stderr chunk (state=3): >>><<< 30575 1726867631.38676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867631.38692: handler run complete 30575 1726867631.38763: attempt loop complete, returning result 30575 1726867631.38983: _execute() done 30575 1726867631.38986: dumping result to json 30575 1726867631.38988: done dumping result, returning 30575 1726867631.38991: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-e081-a588-00000000146b] 30575 1726867631.38993: sending task result for task 0affcac9-a3a5-e081-a588-00000000146b ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867631.39480: no more pending results, returning what we have 30575 1726867631.39483: results queue empty 30575 1726867631.39484: checking for any_errors_fatal 30575 1726867631.39490: done checking for any_errors_fatal 30575 1726867631.39491: checking for max_fail_percentage 30575 1726867631.39492: done checking for max_fail_percentage 30575 1726867631.39493: checking to see if all hosts have failed and the running result is not ok 30575 1726867631.39494: done checking to see if all hosts have failed 30575 1726867631.39495: getting the remaining hosts for this loop 30575 1726867631.39496: done getting the remaining hosts for this loop 30575 1726867631.39499: getting the next task for host managed_node3 30575 1726867631.39507: done getting next task for host managed_node3 30575 1726867631.39510: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867631.39516: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867631.39528: getting variables 30575 1726867631.39529: in VariableManager get_vars() 30575 1726867631.39563: Calling all_inventory to load vars for managed_node3 30575 1726867631.39565: Calling groups_inventory to load vars for managed_node3 30575 1726867631.39567: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867631.39576: Calling all_plugins_play to load vars for managed_node3 30575 1726867631.39783: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867631.39788: Calling groups_plugins_play to load vars for managed_node3 30575 1726867631.41095: done sending task result for task 0affcac9-a3a5-e081-a588-00000000146b 30575 1726867631.41100: WORKER PROCESS EXITING 30575 1726867631.42432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867631.45581: done with get_vars() 30575 1726867631.45606: done getting variables 30575 1726867631.45663: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:27:11 -0400 (0:00:00.648) 0:01:06.836 ****** 30575 1726867631.45911: entering _queue_task() for managed_node3/service 30575 1726867631.46460: worker is 1 (out of 1 available) 30575 1726867631.46473: exiting _queue_task() for managed_node3/service 30575 1726867631.46689: done queuing things up, now waiting for results queue to drain 30575 1726867631.46692: waiting for pending results... 30575 1726867631.47076: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867631.47415: in run() - task 0affcac9-a3a5-e081-a588-00000000146c 30575 1726867631.47431: variable 'ansible_search_path' from source: unknown 30575 1726867631.47436: variable 'ansible_search_path' from source: unknown 30575 1726867631.47469: calling self._execute() 30575 1726867631.47566: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867631.47571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867631.47580: variable 'omit' from source: magic vars 30575 1726867631.48211: variable 'ansible_distribution_major_version' from source: facts 30575 1726867631.48226: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867631.48343: variable 'network_provider' from source: set_fact 30575 1726867631.48349: Evaluated conditional (network_provider == "nm"): True 30575 1726867631.48449: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 
1726867631.48546: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867631.48719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867631.51318: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867631.51583: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867631.51587: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867631.51589: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867631.51591: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867631.51684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867631.51818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867631.52182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867631.52186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867631.52189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30575 1726867631.52191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867631.52194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867631.52196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867631.52408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867631.52434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867631.52483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867631.52515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867631.52545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867631.52824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867631.52982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867631.52993: variable 'network_connections' from source: include params 30575 1726867631.53009: variable 'interface' from source: play vars 30575 1726867631.53084: variable 'interface' from source: play vars 30575 1726867631.53356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867631.53529: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867631.53982: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867631.53984: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867631.53986: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867631.53988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867631.53990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867631.53992: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867631.54198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867631.54249: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867631.54688: variable 'network_connections' from source: include params 30575 1726867631.54698: variable 'interface' from source: play vars 30575 1726867631.54763: variable 'interface' from source: play vars 30575 1726867631.55182: Evaluated conditional (__network_wpa_supplicant_required): False 30575 1726867631.55185: when evaluation is False, skipping this task 30575 1726867631.55188: _execute() done 30575 1726867631.55190: dumping result to json 30575 1726867631.55192: done dumping result, returning 30575 1726867631.55194: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-e081-a588-00000000146c] 30575 1726867631.55204: sending task result for task 0affcac9-a3a5-e081-a588-00000000146c 30575 1726867631.55279: done sending task result for task 0affcac9-a3a5-e081-a588-00000000146c 30575 1726867631.55282: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30575 1726867631.55327: no more pending results, returning what we have 30575 1726867631.55330: results queue empty 30575 1726867631.55331: checking for any_errors_fatal 30575 1726867631.55354: done checking for any_errors_fatal 30575 1726867631.55355: checking for max_fail_percentage 30575 1726867631.55357: done checking for max_fail_percentage 30575 1726867631.55358: checking to see if all hosts have failed and the running result is not ok 30575 1726867631.55359: done checking to see if all hosts have failed 30575 1726867631.55360: getting the remaining hosts for this loop 30575 1726867631.55361: done getting the remaining hosts for this loop 30575 1726867631.55365: getting the next task 
for host managed_node3 30575 1726867631.55374: done getting next task for host managed_node3 30575 1726867631.55379: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867631.55385: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867631.55408: getting variables 30575 1726867631.55409: in VariableManager get_vars() 30575 1726867631.55450: Calling all_inventory to load vars for managed_node3 30575 1726867631.55452: Calling groups_inventory to load vars for managed_node3 30575 1726867631.55455: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867631.55464: Calling all_plugins_play to load vars for managed_node3 30575 1726867631.55467: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867631.55470: Calling groups_plugins_play to load vars for managed_node3 30575 1726867631.58266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867631.60486: done with get_vars() 30575 1726867631.60514: done getting variables 30575 1726867631.60585: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:27:11 -0400 (0:00:00.147) 0:01:06.983 ****** 30575 1726867631.60624: entering _queue_task() for managed_node3/service 30575 1726867631.60973: worker is 1 (out of 1 available) 30575 1726867631.61190: exiting _queue_task() for managed_node3/service 30575 1726867631.61201: done queuing things up, now waiting for results queue to drain 30575 1726867631.61202: waiting for pending results... 
30575 1726867631.61320: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867631.61498: in run() - task 0affcac9-a3a5-e081-a588-00000000146d 30575 1726867631.61528: variable 'ansible_search_path' from source: unknown 30575 1726867631.61538: variable 'ansible_search_path' from source: unknown 30575 1726867631.61606: calling self._execute() 30575 1726867631.61886: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867631.61899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867631.61916: variable 'omit' from source: magic vars 30575 1726867631.62355: variable 'ansible_distribution_major_version' from source: facts 30575 1726867631.62373: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867631.62507: variable 'network_provider' from source: set_fact 30575 1726867631.62600: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867631.62604: when evaluation is False, skipping this task 30575 1726867631.62607: _execute() done 30575 1726867631.62609: dumping result to json 30575 1726867631.62614: done dumping result, returning 30575 1726867631.62617: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-e081-a588-00000000146d] 30575 1726867631.62619: sending task result for task 0affcac9-a3a5-e081-a588-00000000146d skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867631.62842: no more pending results, returning what we have 30575 1726867631.62847: results queue empty 30575 1726867631.62848: checking for any_errors_fatal 30575 1726867631.62857: done checking for any_errors_fatal 30575 1726867631.62858: checking for max_fail_percentage 30575 1726867631.62860: done checking for max_fail_percentage 30575 
1726867631.62861: checking to see if all hosts have failed and the running result is not ok 30575 1726867631.62862: done checking to see if all hosts have failed 30575 1726867631.62863: getting the remaining hosts for this loop 30575 1726867631.62864: done getting the remaining hosts for this loop 30575 1726867631.62869: getting the next task for host managed_node3 30575 1726867631.62879: done getting next task for host managed_node3 30575 1726867631.62883: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867631.62890: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867631.62917: getting variables 30575 1726867631.62920: in VariableManager get_vars() 30575 1726867631.62964: Calling all_inventory to load vars for managed_node3 30575 1726867631.62967: Calling groups_inventory to load vars for managed_node3 30575 1726867631.62969: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867631.63096: Calling all_plugins_play to load vars for managed_node3 30575 1726867631.63100: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867631.63109: Calling groups_plugins_play to load vars for managed_node3 30575 1726867631.63670: done sending task result for task 0affcac9-a3a5-e081-a588-00000000146d 30575 1726867631.63673: WORKER PROCESS EXITING 30575 1726867631.65644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867631.67828: done with get_vars() 30575 1726867631.67853: done getting variables 30575 1726867631.67974: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:27:11 -0400 (0:00:00.074) 0:01:07.057 ****** 30575 1726867631.68038: entering _queue_task() for managed_node3/copy 30575 1726867631.68708: worker is 1 (out of 1 available) 30575 1726867631.68719: exiting _queue_task() for managed_node3/copy 30575 1726867631.68729: done queuing things up, now waiting for results queue to drain 30575 1726867631.68731: waiting for pending results... 
30575 1726867631.68814: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867631.69032: in run() - task 0affcac9-a3a5-e081-a588-00000000146e 30575 1726867631.69052: variable 'ansible_search_path' from source: unknown 30575 1726867631.69071: variable 'ansible_search_path' from source: unknown 30575 1726867631.69116: calling self._execute() 30575 1726867631.69287: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867631.69291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867631.69293: variable 'omit' from source: magic vars 30575 1726867631.70117: variable 'ansible_distribution_major_version' from source: facts 30575 1726867631.70125: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867631.70236: variable 'network_provider' from source: set_fact 30575 1726867631.70242: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867631.70245: when evaluation is False, skipping this task 30575 1726867631.70248: _execute() done 30575 1726867631.70251: dumping result to json 30575 1726867631.70255: done dumping result, returning 30575 1726867631.70265: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-e081-a588-00000000146e] 30575 1726867631.70290: sending task result for task 0affcac9-a3a5-e081-a588-00000000146e skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30575 1726867631.70732: no more pending results, returning what we have 30575 1726867631.70736: results queue empty 30575 1726867631.70736: checking for any_errors_fatal 30575 1726867631.70741: done checking for any_errors_fatal 30575 1726867631.70742: checking for max_fail_percentage 30575 
1726867631.70743: done checking for max_fail_percentage 30575 1726867631.70744: checking to see if all hosts have failed and the running result is not ok 30575 1726867631.70745: done checking to see if all hosts have failed 30575 1726867631.70745: getting the remaining hosts for this loop 30575 1726867631.70746: done getting the remaining hosts for this loop 30575 1726867631.70749: getting the next task for host managed_node3 30575 1726867631.70759: done getting next task for host managed_node3 30575 1726867631.70763: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867631.70767: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867631.70790: getting variables 30575 1726867631.70791: in VariableManager get_vars() 30575 1726867631.70825: Calling all_inventory to load vars for managed_node3 30575 1726867631.70827: Calling groups_inventory to load vars for managed_node3 30575 1726867631.70829: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867631.70837: Calling all_plugins_play to load vars for managed_node3 30575 1726867631.70839: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867631.70841: Calling groups_plugins_play to load vars for managed_node3 30575 1726867631.71420: done sending task result for task 0affcac9-a3a5-e081-a588-00000000146e 30575 1726867631.71423: WORKER PROCESS EXITING 30575 1726867631.72545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867631.74333: done with get_vars() 30575 1726867631.74354: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:27:11 -0400 (0:00:00.064) 0:01:07.124 ****** 30575 1726867631.74688: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867631.75169: worker is 1 (out of 1 available) 30575 1726867631.75185: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867631.75199: done queuing things up, now waiting for results queue to drain 30575 1726867631.75201: waiting for pending results... 
30575 1726867631.75607: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867631.75776: in run() - task 0affcac9-a3a5-e081-a588-00000000146f 30575 1726867631.75802: variable 'ansible_search_path' from source: unknown 30575 1726867631.75811: variable 'ansible_search_path' from source: unknown 30575 1726867631.75856: calling self._execute() 30575 1726867631.75966: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867631.75984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867631.76001: variable 'omit' from source: magic vars 30575 1726867631.76382: variable 'ansible_distribution_major_version' from source: facts 30575 1726867631.76401: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867631.76419: variable 'omit' from source: magic vars 30575 1726867631.76494: variable 'omit' from source: magic vars 30575 1726867631.76652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867631.78820: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867631.78887: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867631.78934: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867631.78973: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867631.79006: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867631.79093: variable 'network_provider' from source: set_fact 30575 1726867631.79223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867631.79262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867631.79295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867631.79343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867631.79363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867631.79438: variable 'omit' from source: magic vars 30575 1726867631.79549: variable 'omit' from source: magic vars 30575 1726867631.79657: variable 'network_connections' from source: include params 30575 1726867631.79679: variable 'interface' from source: play vars 30575 1726867631.79781: variable 'interface' from source: play vars 30575 1726867631.79905: variable 'omit' from source: magic vars 30575 1726867631.79918: variable '__lsr_ansible_managed' from source: task vars 30575 1726867631.79980: variable '__lsr_ansible_managed' from source: task vars 30575 1726867631.80170: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30575 1726867631.80427: Loaded config def from plugin (lookup/template) 30575 1726867631.80430: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30575 1726867631.80432: File lookup term: get_ansible_managed.j2 30575 1726867631.80435: variable 
'ansible_search_path' from source: unknown 30575 1726867631.80437: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30575 1726867631.80451: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30575 1726867631.80474: variable 'ansible_search_path' from source: unknown 30575 1726867631.86542: variable 'ansible_managed' from source: unknown 30575 1726867631.86673: variable 'omit' from source: magic vars 30575 1726867631.86712: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867631.86810: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867631.86813: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867631.86816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30575 1726867631.86818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867631.86830: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867631.86838: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867631.86846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867631.86968: Set connection var ansible_pipelining to False 30575 1726867631.86979: Set connection var ansible_shell_type to sh 30575 1726867631.86991: Set connection var ansible_shell_executable to /bin/sh 30575 1726867631.87002: Set connection var ansible_timeout to 10 30575 1726867631.87012: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867631.87028: Set connection var ansible_connection to ssh 30575 1726867631.87056: variable 'ansible_shell_executable' from source: unknown 30575 1726867631.87064: variable 'ansible_connection' from source: unknown 30575 1726867631.87081: variable 'ansible_module_compression' from source: unknown 30575 1726867631.87083: variable 'ansible_shell_type' from source: unknown 30575 1726867631.87085: variable 'ansible_shell_executable' from source: unknown 30575 1726867631.87088: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867631.87135: variable 'ansible_pipelining' from source: unknown 30575 1726867631.87138: variable 'ansible_timeout' from source: unknown 30575 1726867631.87140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867631.87242: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867631.87266: variable 'omit' from 
source: magic vars 30575 1726867631.87276: starting attempt loop 30575 1726867631.87285: running the handler 30575 1726867631.87352: _low_level_execute_command(): starting 30575 1726867631.87355: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867631.88096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867631.88115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867631.88134: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867631.88220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867631.89926: stdout chunk (state=3): >>>/root <<< 30575 1726867631.90068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867631.90082: stdout chunk (state=3): >>><<< 30575 1726867631.90100: stderr chunk (state=3): >>><<< 30575 1726867631.90124: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867631.90141: _low_level_execute_command(): starting 30575 1726867631.90214: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867631.9013042-33740-8695880384121 `" && echo ansible-tmp-1726867631.9013042-33740-8695880384121="` echo /root/.ansible/tmp/ansible-tmp-1726867631.9013042-33740-8695880384121 `" ) && sleep 0' 30575 1726867631.90732: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867631.90746: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867631.90796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867631.90810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867631.90895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867631.90918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867631.91010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867631.92952: stdout chunk (state=3): >>>ansible-tmp-1726867631.9013042-33740-8695880384121=/root/.ansible/tmp/ansible-tmp-1726867631.9013042-33740-8695880384121 <<< 30575 1726867631.93115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867631.93119: stdout chunk (state=3): >>><<< 30575 1726867631.93121: stderr chunk (state=3): >>><<< 30575 1726867631.93284: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867631.9013042-33740-8695880384121=/root/.ansible/tmp/ansible-tmp-1726867631.9013042-33740-8695880384121 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867631.93287: variable 'ansible_module_compression' from source: unknown 30575 1726867631.93289: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30575 1726867631.93291: variable 'ansible_facts' from source: unknown 30575 1726867631.93418: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867631.9013042-33740-8695880384121/AnsiballZ_network_connections.py 30575 1726867631.93648: Sending initial data 30575 1726867631.93651: Sent initial data (166 bytes) 30575 1726867631.94519: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867631.94535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867631.94550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867631.94570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867631.94617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867631.94642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867631.94723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867631.94742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867631.94764: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867631.94842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867631.96644: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867631.96692: stderr chunk 
(state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867631.96773: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpc8qkkfg8 /root/.ansible/tmp/ansible-tmp-1726867631.9013042-33740-8695880384121/AnsiballZ_network_connections.py <<< 30575 1726867631.96781: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867631.9013042-33740-8695880384121/AnsiballZ_network_connections.py" <<< 30575 1726867631.97263: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpc8qkkfg8" to remote "/root/.ansible/tmp/ansible-tmp-1726867631.9013042-33740-8695880384121/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867631.9013042-33740-8695880384121/AnsiballZ_network_connections.py" <<< 30575 1726867631.99143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867631.99388: stderr chunk (state=3): >>><<< 30575 1726867631.99392: stdout chunk (state=3): >>><<< 30575 1726867631.99394: done transferring module to remote 30575 1726867631.99397: _low_level_execute_command(): starting 30575 1726867631.99399: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867631.9013042-33740-8695880384121/ /root/.ansible/tmp/ansible-tmp-1726867631.9013042-33740-8695880384121/AnsiballZ_network_connections.py && sleep 0' 30575 1726867632.00374: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867632.00380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867632.00383: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867632.00385: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867632.00387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867632.00537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867632.00541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867632.00609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867632.00680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867632.02571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867632.02575: stdout chunk (state=3): >>><<< 30575 1726867632.02585: stderr chunk (state=3): >>><<< 30575 1726867632.02627: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867632.02630: _low_level_execute_command(): starting 30575 1726867632.02634: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867631.9013042-33740-8695880384121/AnsiballZ_network_connections.py && sleep 0' 30575 1726867632.03586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867632.03590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867632.03593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867632.03595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867632.03597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867632.03599: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867632.03601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867632.03604: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867632.03607: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867632.03609: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867632.03612: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867632.03614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867632.03616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867632.03619: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867632.03622: stderr chunk (state=3): >>>debug2: match found <<< 30575 1726867632.03879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867632.03883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867632.03885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867632.03887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867632.30988: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 30575 1726867632.31000: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_oshhvekg/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_oshhvekg/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/12e4c575-fa21-4cd0-afc7-2cb6b45b6219: error=unknown <<< 30575 1726867632.31204: stdout chunk (state=3): 
>>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30575 1726867632.33120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867632.33142: stderr chunk (state=3): >>><<< 30575 1726867632.33145: stdout chunk (state=3): >>><<< 30575 1726867632.33160: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_oshhvekg/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_oshhvekg/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/12e4c575-fa21-4cd0-afc7-2cb6b45b6219: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, 
"invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867632.33189: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867631.9013042-33740-8695880384121/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
30575 1726867632.33197: _low_level_execute_command(): starting
30575 1726867632.33202: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867631.9013042-33740-8695880384121/ > /dev/null 2>&1 && sleep 0'
30575 1726867632.33591: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
30575 1726867632.33598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30575 1726867632.33625: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30575 1726867632.33628: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30575 1726867632.33630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30575 1726867632.33681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
30575 1726867632.33685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30575 1726867632.33762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30575 1726867632.35912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30575 1726867632.35915: stdout chunk (state=3): >>><<<
30575 1726867632.35918: stderr chunk (state=3): >>><<<
30575 1726867632.35920: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30575 1726867632.35926: handler run complete
30575 1726867632.35928: attempt loop complete, returning result
30575 1726867632.35930: _execute() done
30575 1726867632.36132: dumping result to json
30575 1726867632.36136: done dumping result, returning
30575 1726867632.36138: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-e081-a588-00000000146f]
30575 1726867632.36141: sending task result for task 0affcac9-a3a5-e081-a588-00000000146f
30575 1726867632.36222: done sending task result for task 0affcac9-a3a5-e081-a588-00000000146f
30575 1726867632.36225: WORKER PROCESS EXITING
changed: [managed_node3] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "name": "statebr",
                    "persistent_state": "absent"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

30575 1726867632.36333: no more pending results, returning what we have
30575 1726867632.36337: results queue empty
30575 1726867632.36338: checking for any_errors_fatal
30575 1726867632.36344: done checking for any_errors_fatal
30575 1726867632.36344: checking for max_fail_percentage
30575 1726867632.36346: done checking for max_fail_percentage
30575 1726867632.36347: checking to see if all hosts have failed and the running result is not ok
30575 1726867632.36349: done checking to see if all hosts have failed
30575 1726867632.36349: getting the remaining hosts for this loop
30575 1726867632.36351: done getting the remaining hosts for this loop
30575 1726867632.36354: getting the next task for host managed_node3
30575 1726867632.36363: done getting next task for host managed_node3
30575 1726867632.36366: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
30575 1726867632.36371: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867632.36384: getting variables
30575 1726867632.36386: in VariableManager get_vars()
30575 1726867632.36427: Calling all_inventory to load vars for managed_node3
30575 1726867632.36429: Calling groups_inventory to load vars for managed_node3
30575 1726867632.36432: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867632.36441: Calling all_plugins_play to load vars for managed_node3
30575 1726867632.36444: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867632.36447: Calling groups_plugins_play to load vars for managed_node3
30575 1726867632.38506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867632.40759: done with get_vars()
30575 1726867632.40788: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024 17:27:12 -0400 (0:00:00.661) 0:01:07.786 ******
30575 1726867632.40874: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state
30575 1726867632.41623: worker is 1 (out of 1 available)
30575 1726867632.41636: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state
30575 1726867632.41650: done queuing things up, now waiting for results queue to drain
30575 1726867632.41651: waiting for pending results...
30575 1726867632.42065: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state
30575 1726867632.42147: in run() - task 0affcac9-a3a5-e081-a588-000000001470
30575 1726867632.42159: variable 'ansible_search_path' from source: unknown
30575 1726867632.42162: variable 'ansible_search_path' from source: unknown
30575 1726867632.42200: calling self._execute()
30575 1726867632.42296: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867632.42302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867632.42318: variable 'omit' from source: magic vars
30575 1726867632.42711: variable 'ansible_distribution_major_version' from source: facts
30575 1726867632.42725: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867632.42851: variable 'network_state' from source: role '' defaults
30575 1726867632.42866: Evaluated conditional (network_state != {}): False
30575 1726867632.42871: when evaluation is False, skipping this task
30575 1726867632.42874: _execute() done
30575 1726867632.42876: dumping result to json
30575 1726867632.42881: done dumping result, returning
30575 1726867632.42889: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-e081-a588-000000001470]
30575 1726867632.42894: sending task result for task 0affcac9-a3a5-e081-a588-000000001470
30575 1726867632.43089: done sending task result for task 0affcac9-a3a5-e081-a588-000000001470
30575 1726867632.43092: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30575 1726867632.43171: no more pending results, returning what we have
30575 1726867632.43174: results queue empty
30575 1726867632.43175: checking for any_errors_fatal
30575 1726867632.43184: done checking for any_errors_fatal
30575 1726867632.43184: checking for max_fail_percentage
30575 1726867632.43186: done checking for max_fail_percentage
30575 1726867632.43187: checking to see if all hosts have failed and the running result is not ok
30575 1726867632.43187: done checking to see if all hosts have failed
30575 1726867632.43188: getting the remaining hosts for this loop
30575 1726867632.43189: done getting the remaining hosts for this loop
30575 1726867632.43192: getting the next task for host managed_node3
30575 1726867632.43199: done getting next task for host managed_node3
30575 1726867632.43202: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
30575 1726867632.43207: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867632.43227: getting variables
30575 1726867632.43228: in VariableManager get_vars()
30575 1726867632.43262: Calling all_inventory to load vars for managed_node3
30575 1726867632.43264: Calling groups_inventory to load vars for managed_node3
30575 1726867632.43266: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867632.43275: Calling all_plugins_play to load vars for managed_node3
30575 1726867632.43279: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867632.43282: Calling groups_plugins_play to load vars for managed_node3
30575 1726867632.45556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867632.47262: done with get_vars()
30575 1726867632.47286: done getting variables
30575 1726867632.47342: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 17:27:12 -0400 (0:00:00.065) 0:01:07.851 ******
30575 1726867632.47379: entering _queue_task() for managed_node3/debug
30575 1726867632.47708: worker is 1 (out of 1 available)
30575 1726867632.47721: exiting _queue_task() for managed_node3/debug
30575 1726867632.47733: done queuing things up, now waiting for results queue to drain
30575 1726867632.47735: waiting for pending results...
30575 1726867632.48033: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
30575 1726867632.48182: in run() - task 0affcac9-a3a5-e081-a588-000000001471
30575 1726867632.48203: variable 'ansible_search_path' from source: unknown
30575 1726867632.48207: variable 'ansible_search_path' from source: unknown
30575 1726867632.48312: calling self._execute()
30575 1726867632.48333: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867632.48338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867632.48348: variable 'omit' from source: magic vars
30575 1726867632.48741: variable 'ansible_distribution_major_version' from source: facts
30575 1726867632.48750: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867632.48757: variable 'omit' from source: magic vars
30575 1726867632.48822: variable 'omit' from source: magic vars
30575 1726867632.48859: variable 'omit' from source: magic vars
30575 1726867632.48902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30575 1726867632.48946: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30575 1726867632.48965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30575 1726867632.48984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867632.48996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867632.49029: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30575 1726867632.49032: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867632.49035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867632.49140: Set connection var ansible_pipelining to False
30575 1726867632.49144: Set connection var ansible_shell_type to sh
30575 1726867632.49153: Set connection var ansible_shell_executable to /bin/sh
30575 1726867632.49159: Set connection var ansible_timeout to 10
30575 1726867632.49165: Set connection var ansible_module_compression to ZIP_DEFLATED
30575 1726867632.49182: Set connection var ansible_connection to ssh
30575 1726867632.49198: variable 'ansible_shell_executable' from source: unknown
30575 1726867632.49201: variable 'ansible_connection' from source: unknown
30575 1726867632.49204: variable 'ansible_module_compression' from source: unknown
30575 1726867632.49206: variable 'ansible_shell_type' from source: unknown
30575 1726867632.49209: variable 'ansible_shell_executable' from source: unknown
30575 1726867632.49211: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867632.49291: variable 'ansible_pipelining' from source: unknown
30575 1726867632.49295: variable 'ansible_timeout' from source: unknown
30575 1726867632.49297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867632.49365: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30575 1726867632.49384: variable 'omit' from source: magic vars
30575 1726867632.49389: starting attempt loop
30575 1726867632.49392: running the handler
30575 1726867632.49526: variable '__network_connections_result' from source: set_fact
30575 1726867632.49574: handler run complete
30575 1726867632.49598: attempt loop complete, returning result
30575 1726867632.49601: _execute() done
30575 1726867632.49604: dumping result to json
30575 1726867632.49607: done dumping result, returning
30575 1726867632.49727: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-e081-a588-000000001471]
30575 1726867632.49730: sending task result for task 0affcac9-a3a5-e081-a588-000000001471
30575 1726867632.49795: done sending task result for task 0affcac9-a3a5-e081-a588-000000001471
30575 1726867632.49798: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
30575 1726867632.49893: no more pending results, returning what we have
30575 1726867632.49895: results queue empty
30575 1726867632.49896: checking for any_errors_fatal
30575 1726867632.49901: done checking for any_errors_fatal
30575 1726867632.49902: checking for max_fail_percentage
30575 1726867632.49903: done checking for max_fail_percentage
30575 1726867632.49904: checking to see if all hosts have failed and the running result is not ok
30575 1726867632.49905: done checking to see if all hosts have failed
30575 1726867632.49905: getting the remaining hosts for this loop
30575 1726867632.49906: done getting the remaining hosts for this loop
30575 1726867632.49910: getting the next task for host managed_node3
30575 1726867632.49916: done getting next task for host managed_node3
30575 1726867632.49920: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
30575 1726867632.49924: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867632.49934: getting variables
30575 1726867632.49936: in VariableManager get_vars()
30575 1726867632.49970: Calling all_inventory to load vars for managed_node3
30575 1726867632.49973: Calling groups_inventory to load vars for managed_node3
30575 1726867632.49975: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867632.49984: Calling all_plugins_play to load vars for managed_node3
30575 1726867632.49987: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867632.49989: Calling groups_plugins_play to load vars for managed_node3
30575 1726867632.51279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867632.53235: done with get_vars()
30575 1726867632.53259: done getting variables
30575 1726867632.53321: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 17:27:12 -0400 (0:00:00.059) 0:01:07.911 ******
30575 1726867632.53361: entering _queue_task() for managed_node3/debug
30575 1726867632.53680: worker is 1 (out of 1 available)
30575 1726867632.53693: exiting _queue_task() for managed_node3/debug
30575 1726867632.53706: done queuing things up, now waiting for results queue to drain
30575 1726867632.53707: waiting for pending results...
30575 1726867632.54007: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
30575 1726867632.54204: in run() - task 0affcac9-a3a5-e081-a588-000000001472
30575 1726867632.54208: variable 'ansible_search_path' from source: unknown
30575 1726867632.54210: variable 'ansible_search_path' from source: unknown
30575 1726867632.54221: calling self._execute()
30575 1726867632.54329: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867632.54341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867632.54354: variable 'omit' from source: magic vars
30575 1726867632.54723: variable 'ansible_distribution_major_version' from source: facts
30575 1726867632.54742: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867632.54755: variable 'omit' from source: magic vars
30575 1726867632.54858: variable 'omit' from source: magic vars
30575 1726867632.54863: variable 'omit' from source: magic vars
30575 1726867632.54906: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30575 1726867632.54945: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30575 1726867632.54973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30575 1726867632.54996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867632.55013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867632.55075: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30575 1726867632.55079: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867632.55081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867632.55157: Set connection var ansible_pipelining to False
30575 1726867632.55165: Set connection var ansible_shell_type to sh
30575 1726867632.55174: Set connection var ansible_shell_executable to /bin/sh
30575 1726867632.55188: Set connection var ansible_timeout to 10
30575 1726867632.55282: Set connection var ansible_module_compression to ZIP_DEFLATED
30575 1726867632.55286: Set connection var ansible_connection to ssh
30575 1726867632.55289: variable 'ansible_shell_executable' from source: unknown
30575 1726867632.55291: variable 'ansible_connection' from source: unknown
30575 1726867632.55293: variable 'ansible_module_compression' from source: unknown
30575 1726867632.55295: variable 'ansible_shell_type' from source: unknown
30575 1726867632.55297: variable 'ansible_shell_executable' from source: unknown
30575 1726867632.55299: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867632.55300: variable 'ansible_pipelining' from source: unknown
30575 1726867632.55302: variable 'ansible_timeout' from source: unknown
30575 1726867632.55304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867632.55404: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30575 1726867632.55424: variable 'omit' from source: magic vars
30575 1726867632.55434: starting attempt loop
30575 1726867632.55440: running the handler
30575 1726867632.55492: variable '__network_connections_result' from source: set_fact
30575 1726867632.55572: variable '__network_connections_result' from source: set_fact
30575 1726867632.55744: handler run complete
30575 1726867632.55747: attempt loop complete, returning result
30575 1726867632.55749: _execute() done
30575 1726867632.55751: dumping result to json
30575 1726867632.55753: done dumping result, returning
30575 1726867632.55755: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-e081-a588-000000001472]
30575 1726867632.55758: sending task result for task 0affcac9-a3a5-e081-a588-000000001472
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "statebr",
                        "persistent_state": "absent"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
30575 1726867632.55935: no more pending results, returning what we have
30575 1726867632.55938: results queue empty
30575 1726867632.55939: checking for any_errors_fatal
30575 1726867632.55947: done checking for any_errors_fatal
30575 1726867632.55947: checking for max_fail_percentage
30575 1726867632.55949: done checking for max_fail_percentage
30575 1726867632.55950: checking to see if all hosts have failed and the running result is not ok
30575 1726867632.55951: done checking to see if all hosts have failed
30575 1726867632.55952: getting the remaining hosts for this loop
30575 1726867632.55953: done getting the remaining hosts for this loop
30575 1726867632.55957: getting the next task for host managed_node3
30575 1726867632.55966: done getting next task for host managed_node3
30575 1726867632.55970: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30575 1726867632.55974: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867632.55988: getting variables
30575 1726867632.55990: in VariableManager get_vars()
30575 1726867632.56028: Calling all_inventory to load vars for managed_node3
30575 1726867632.56031: Calling groups_inventory to load vars for managed_node3
30575 1726867632.56033: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867632.56044: Calling all_plugins_play to load vars for managed_node3
30575 1726867632.56047: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867632.56050: Calling groups_plugins_play to load vars for managed_node3
30575 1726867632.56690: done sending task result for task 0affcac9-a3a5-e081-a588-000000001472
30575 1726867632.56699: WORKER PROCESS EXITING
30575 1726867632.57773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867632.59306: done with get_vars()
30575 1726867632.59326: done getting variables
30575 1726867632.59382: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 17:27:12 -0400 (0:00:00.060) 0:01:07.971 ******
30575 1726867632.59413: entering _queue_task() for managed_node3/debug
30575 1726867632.59706: worker is 1 (out of 1 available)
30575 1726867632.59718: exiting _queue_task() for managed_node3/debug
30575 1726867632.59731: done queuing things up, now waiting for results queue to drain
30575 1726867632.59733: waiting for pending results...
30575 1726867632.60016: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30575 1726867632.60173: in run() - task 0affcac9-a3a5-e081-a588-000000001473
30575 1726867632.60196: variable 'ansible_search_path' from source: unknown
30575 1726867632.60205: variable 'ansible_search_path' from source: unknown
30575 1726867632.60243: calling self._execute()
30575 1726867632.60342: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867632.60353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867632.60367: variable 'omit' from source: magic vars
30575 1726867632.60736: variable 'ansible_distribution_major_version' from source: facts
30575 1726867632.60757: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867632.60883: variable 'network_state' from source: role '' defaults
30575 1726867632.60900: Evaluated conditional (network_state != {}): False
30575 1726867632.60909: when evaluation is False, skipping this task
30575 1726867632.60919: _execute() done
30575 1726867632.60928: dumping result to json
30575 1726867632.60936: done dumping result, returning
30575 1726867632.60948: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-e081-a588-000000001473]
30575 1726867632.60962: sending task result for task 0affcac9-a3a5-e081-a588-000000001473
skipping: [managed_node3] => {
    "false_condition": "network_state != {}"
}
30575 1726867632.61147: no more pending results, returning what we have
30575 1726867632.61151: results queue empty
30575 1726867632.61152: checking for any_errors_fatal
30575 1726867632.61162: done checking for any_errors_fatal
30575 1726867632.61163: checking for max_fail_percentage
30575 1726867632.61165: done checking for max_fail_percentage
30575 1726867632.61166: checking to see if all hosts have failed and the running result is not ok
30575 1726867632.61166: done checking to see if all hosts have failed
30575 1726867632.61167: getting the remaining hosts for this loop
30575 1726867632.61168: done getting the remaining hosts for this loop
30575 1726867632.61172: getting the next task for host managed_node3
30575 1726867632.61182: done getting next task for host managed_node3
30575 1726867632.61185: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
30575 1726867632.61190: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867632.61213: getting variables
30575 1726867632.61215: in VariableManager get_vars()
30575 1726867632.61250: Calling all_inventory to load vars for managed_node3
30575 1726867632.61252: Calling groups_inventory to load vars for managed_node3
30575 1726867632.61254: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867632.61265: Calling all_plugins_play to load vars for managed_node3
30575 1726867632.61268: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867632.61270: Calling groups_plugins_play to load vars for managed_node3
30575 1726867632.61924: done sending task result for task 0affcac9-a3a5-e081-a588-000000001473
30575 1726867632.61927: WORKER PROCESS EXITING
30575 1726867632.63240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867632.64958: done with get_vars()
30575 1726867632.64974: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 17:27:12 -0400 (0:00:00.056) 0:01:08.028 ******
30575 1726867632.65046: entering _queue_task() for managed_node3/ping
30575 1726867632.65270: worker is 1 (out of 1 available)
30575 1726867632.65285: exiting _queue_task() for managed_node3/ping
30575 1726867632.65298: done queuing things up, now waiting for results queue to drain
30575 1726867632.65299: waiting for pending results...
30575 1726867632.65481: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity
30575 1726867632.65564: in run() - task 0affcac9-a3a5-e081-a588-000000001474
30575 1726867632.65575: variable 'ansible_search_path' from source: unknown
30575 1726867632.65579: variable 'ansible_search_path' from source: unknown
30575 1726867632.65606: calling self._execute()
30575 1726867632.65682: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867632.65687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867632.65696: variable 'omit' from source: magic vars
30575 1726867632.65967: variable 'ansible_distribution_major_version' from source: facts
30575 1726867632.65979: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867632.65985: variable 'omit' from source: magic vars
30575 1726867632.66027: variable 'omit' from source: magic vars
30575 1726867632.66048: variable 'omit' from source: magic vars
30575 1726867632.66084: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30575 1726867632.66109: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30575 1726867632.66125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30575 1726867632.66153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867632.66161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867632.66237: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30575 1726867632.66240: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867632.66242: variable 'ansible_ssh_extra_args' from source: host vars for
'managed_node3' 30575 1726867632.66683: Set connection var ansible_pipelining to False 30575 1726867632.66686: Set connection var ansible_shell_type to sh 30575 1726867632.66688: Set connection var ansible_shell_executable to /bin/sh 30575 1726867632.66689: Set connection var ansible_timeout to 10 30575 1726867632.66691: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867632.66693: Set connection var ansible_connection to ssh 30575 1726867632.66695: variable 'ansible_shell_executable' from source: unknown 30575 1726867632.66696: variable 'ansible_connection' from source: unknown 30575 1726867632.66698: variable 'ansible_module_compression' from source: unknown 30575 1726867632.66827: variable 'ansible_shell_type' from source: unknown 30575 1726867632.66830: variable 'ansible_shell_executable' from source: unknown 30575 1726867632.66833: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867632.66835: variable 'ansible_pipelining' from source: unknown 30575 1726867632.66837: variable 'ansible_timeout' from source: unknown 30575 1726867632.66839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867632.67245: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867632.67254: variable 'omit' from source: magic vars 30575 1726867632.67260: starting attempt loop 30575 1726867632.67265: running the handler 30575 1726867632.67276: _low_level_execute_command(): starting 30575 1726867632.67286: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867632.68260: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 
1726867632.68270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867632.68287: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867632.68294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867632.68407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867632.68415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867632.68464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867632.70253: stdout chunk (state=3): >>>/root <<< 30575 1726867632.70325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867632.70336: stdout chunk (state=3): >>><<< 30575 1726867632.70378: stderr chunk (state=3): >>><<< 30575 1726867632.70593: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867632.70596: _low_level_execute_command(): starting 30575 1726867632.70599: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867632.7040765-33787-279906871535596 `" && echo ansible-tmp-1726867632.7040765-33787-279906871535596="` echo /root/.ansible/tmp/ansible-tmp-1726867632.7040765-33787-279906871535596 `" ) && sleep 0' 30575 1726867632.71275: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867632.71294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867632.71315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867632.71335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867632.71353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867632.71394: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 30575 1726867632.71483: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867632.71506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867632.71589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867632.73505: stdout chunk (state=3): >>>ansible-tmp-1726867632.7040765-33787-279906871535596=/root/.ansible/tmp/ansible-tmp-1726867632.7040765-33787-279906871535596 <<< 30575 1726867632.73732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867632.73735: stdout chunk (state=3): >>><<< 30575 1726867632.73737: stderr chunk (state=3): >>><<< 30575 1726867632.73983: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867632.7040765-33787-279906871535596=/root/.ansible/tmp/ansible-tmp-1726867632.7040765-33787-279906871535596 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867632.73987: variable 'ansible_module_compression' from source: unknown 30575 1726867632.73989: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30575 1726867632.73991: variable 'ansible_facts' from source: unknown 30575 1726867632.74284: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867632.7040765-33787-279906871535596/AnsiballZ_ping.py 30575 1726867632.74614: Sending initial data 30575 1726867632.74684: Sent initial data (153 bytes) 30575 1726867632.75757: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867632.75775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867632.75797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867632.75891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867632.75933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867632.75951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867632.75971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867632.76048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867632.77615: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867632.77683: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867632.77759: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpp6xf9kof /root/.ansible/tmp/ansible-tmp-1726867632.7040765-33787-279906871535596/AnsiballZ_ping.py <<< 30575 1726867632.77762: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867632.7040765-33787-279906871535596/AnsiballZ_ping.py" <<< 30575 1726867632.77806: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpp6xf9kof" to remote "/root/.ansible/tmp/ansible-tmp-1726867632.7040765-33787-279906871535596/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867632.7040765-33787-279906871535596/AnsiballZ_ping.py" <<< 30575 1726867632.78558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867632.78658: stderr chunk (state=3): >>><<< 30575 1726867632.78661: stdout chunk (state=3): >>><<< 30575 1726867632.78888: done transferring module to remote 30575 1726867632.78972: _low_level_execute_command(): starting 30575 1726867632.78976: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867632.7040765-33787-279906871535596/ /root/.ansible/tmp/ansible-tmp-1726867632.7040765-33787-279906871535596/AnsiballZ_ping.py && sleep 0' 30575 1726867632.80141: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867632.80156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867632.80236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867632.82082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867632.82086: stdout chunk (state=3): >>><<< 30575 1726867632.82088: stderr chunk (state=3): >>><<< 30575 1726867632.82091: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867632.82093: _low_level_execute_command(): starting 30575 1726867632.82095: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867632.7040765-33787-279906871535596/AnsiballZ_ping.py && sleep 0' 30575 1726867632.82699: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867632.82774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867632.82796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867632.82867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867632.98030: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30575 1726867632.99233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.15.68 closed. <<< 30575 1726867632.99257: stderr chunk (state=3): >>><<< 30575 1726867632.99261: stdout chunk (state=3): >>><<< 30575 1726867632.99274: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867632.99299: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867632.7040765-33787-279906871535596/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867632.99306: _low_level_execute_command(): starting 30575 1726867632.99313: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867632.7040765-33787-279906871535596/ > /dev/null 2>&1 && sleep 0' 30575 1726867632.99711: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867632.99716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867632.99732: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867632.99783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867632.99787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867632.99837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867633.01782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867633.01786: stdout chunk (state=3): >>><<< 30575 1726867633.01788: stderr chunk (state=3): >>><<< 30575 1726867633.01791: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867633.01797: handler run complete 30575 1726867633.01799: attempt loop complete, returning result 
30575 1726867633.01801: _execute() done 30575 1726867633.01803: dumping result to json 30575 1726867633.01805: done dumping result, returning 30575 1726867633.01807: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-e081-a588-000000001474] 30575 1726867633.01810: sending task result for task 0affcac9-a3a5-e081-a588-000000001474 30575 1726867633.01884: done sending task result for task 0affcac9-a3a5-e081-a588-000000001474 30575 1726867633.01887: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30575 1726867633.02022: no more pending results, returning what we have 30575 1726867633.02026: results queue empty 30575 1726867633.02027: checking for any_errors_fatal 30575 1726867633.02035: done checking for any_errors_fatal 30575 1726867633.02036: checking for max_fail_percentage 30575 1726867633.02037: done checking for max_fail_percentage 30575 1726867633.02038: checking to see if all hosts have failed and the running result is not ok 30575 1726867633.02039: done checking to see if all hosts have failed 30575 1726867633.02040: getting the remaining hosts for this loop 30575 1726867633.02042: done getting the remaining hosts for this loop 30575 1726867633.02045: getting the next task for host managed_node3 30575 1726867633.02330: done getting next task for host managed_node3 30575 1726867633.02334: ^ task is: TASK: meta (role_complete) 30575 1726867633.02339: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867633.02352: getting variables 30575 1726867633.02354: in VariableManager get_vars() 30575 1726867633.02702: Calling all_inventory to load vars for managed_node3 30575 1726867633.02705: Calling groups_inventory to load vars for managed_node3 30575 1726867633.02707: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867633.02717: Calling all_plugins_play to load vars for managed_node3 30575 1726867633.02720: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867633.02723: Calling groups_plugins_play to load vars for managed_node3 30575 1726867633.05730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867633.08871: done with get_vars() 30575 1726867633.08908: done getting variables 30575 1726867633.09119: done queuing things up, now waiting for results queue to drain 30575 1726867633.09121: results queue empty 30575 1726867633.09122: checking for any_errors_fatal 30575 1726867633.09125: done checking for any_errors_fatal 30575 1726867633.09126: checking for max_fail_percentage 30575 1726867633.09127: done checking for max_fail_percentage 30575 1726867633.09128: checking to see if all 
hosts have failed and the running result is not ok 30575 1726867633.09129: done checking to see if all hosts have failed 30575 1726867633.09130: getting the remaining hosts for this loop 30575 1726867633.09131: done getting the remaining hosts for this loop 30575 1726867633.09133: getting the next task for host managed_node3 30575 1726867633.09139: done getting next task for host managed_node3 30575 1726867633.09142: ^ task is: TASK: Asserts 30575 1726867633.09144: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867633.09148: getting variables 30575 1726867633.09149: in VariableManager get_vars() 30575 1726867633.09161: Calling all_inventory to load vars for managed_node3 30575 1726867633.09163: Calling groups_inventory to load vars for managed_node3 30575 1726867633.09165: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867633.09170: Calling all_plugins_play to load vars for managed_node3 30575 1726867633.09173: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867633.09175: Calling groups_plugins_play to load vars for managed_node3 30575 1726867633.11165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867633.12872: done with get_vars() 30575 1726867633.12895: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 17:27:13 -0400 (0:00:00.479) 0:01:08.507 ****** 30575 1726867633.12968: entering _queue_task() for managed_node3/include_tasks 30575 1726867633.13350: worker is 1 (out of 1 available) 30575 1726867633.13362: exiting _queue_task() for managed_node3/include_tasks 30575 1726867633.13516: done queuing things up, now waiting for results queue to drain 30575 1726867633.13519: waiting for pending results... 
30575 1726867633.13752: running TaskExecutor() for managed_node3/TASK: Asserts 30575 1726867633.13847: in run() - task 0affcac9-a3a5-e081-a588-00000000100a 30575 1726867633.13851: variable 'ansible_search_path' from source: unknown 30575 1726867633.13854: variable 'ansible_search_path' from source: unknown 30575 1726867633.13892: variable 'lsr_assert' from source: include params 30575 1726867633.14182: variable 'lsr_assert' from source: include params 30575 1726867633.14188: variable 'omit' from source: magic vars 30575 1726867633.14336: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867633.14346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867633.14356: variable 'omit' from source: magic vars 30575 1726867633.14716: variable 'ansible_distribution_major_version' from source: facts 30575 1726867633.14719: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867633.14722: variable 'item' from source: unknown 30575 1726867633.14724: variable 'item' from source: unknown 30575 1726867633.14726: variable 'item' from source: unknown 30575 1726867633.14790: variable 'item' from source: unknown 30575 1726867633.14929: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867633.14933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867633.14936: variable 'omit' from source: magic vars 30575 1726867633.15159: variable 'ansible_distribution_major_version' from source: facts 30575 1726867633.15162: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867633.15165: variable 'item' from source: unknown 30575 1726867633.15167: variable 'item' from source: unknown 30575 1726867633.15169: variable 'item' from source: unknown 30575 1726867633.15339: variable 'item' from source: unknown 30575 1726867633.15389: dumping result to json 30575 1726867633.15392: done dumping result, returning 30575 
1726867633.15394: done running TaskExecutor() for managed_node3/TASK: Asserts [0affcac9-a3a5-e081-a588-00000000100a] 30575 1726867633.15396: sending task result for task 0affcac9-a3a5-e081-a588-00000000100a 30575 1726867633.15436: done sending task result for task 0affcac9-a3a5-e081-a588-00000000100a 30575 1726867633.15439: WORKER PROCESS EXITING 30575 1726867633.15518: no more pending results, returning what we have 30575 1726867633.15523: in VariableManager get_vars() 30575 1726867633.15562: Calling all_inventory to load vars for managed_node3 30575 1726867633.15565: Calling groups_inventory to load vars for managed_node3 30575 1726867633.15568: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867633.15682: Calling all_plugins_play to load vars for managed_node3 30575 1726867633.15687: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867633.15691: Calling groups_plugins_play to load vars for managed_node3 30575 1726867633.16630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867633.17474: done with get_vars() 30575 1726867633.17490: variable 'ansible_search_path' from source: unknown 30575 1726867633.17491: variable 'ansible_search_path' from source: unknown 30575 1726867633.17517: variable 'ansible_search_path' from source: unknown 30575 1726867633.17518: variable 'ansible_search_path' from source: unknown 30575 1726867633.17535: we have included files to process 30575 1726867633.17536: generating all_blocks data 30575 1726867633.17538: done generating all_blocks data 30575 1726867633.17542: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30575 1726867633.17542: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30575 1726867633.17544: Loading data from 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30575 1726867633.17613: in VariableManager get_vars() 30575 1726867633.17628: done with get_vars() 30575 1726867633.17703: done processing included file 30575 1726867633.17705: iterating over new_blocks loaded from include file 30575 1726867633.17706: in VariableManager get_vars() 30575 1726867633.17716: done with get_vars() 30575 1726867633.17717: filtering new block on tags 30575 1726867633.17740: done filtering new block on tags 30575 1726867633.17742: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 => (item=tasks/assert_device_present.yml) 30575 1726867633.17745: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30575 1726867633.17746: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30575 1726867633.17748: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30575 1726867633.17846: in VariableManager get_vars() 30575 1726867633.17862: done with get_vars() 30575 1726867633.17943: done processing included file 30575 1726867633.17945: iterating over new_blocks loaded from include file 30575 1726867633.17946: in VariableManager get_vars() 30575 1726867633.17959: done with get_vars() 30575 1726867633.17961: filtering new block on tags 30575 1726867633.17992: done filtering new block on tags 30575 1726867633.17994: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for 
managed_node3 => (item=tasks/assert_profile_absent.yml) 30575 1726867633.17998: extending task lists for all hosts with included blocks 30575 1726867633.18944: done extending task lists 30575 1726867633.18945: done processing included files 30575 1726867633.18945: results queue empty 30575 1726867633.18946: checking for any_errors_fatal 30575 1726867633.18947: done checking for any_errors_fatal 30575 1726867633.18947: checking for max_fail_percentage 30575 1726867633.18948: done checking for max_fail_percentage 30575 1726867633.18948: checking to see if all hosts have failed and the running result is not ok 30575 1726867633.18949: done checking to see if all hosts have failed 30575 1726867633.18949: getting the remaining hosts for this loop 30575 1726867633.18950: done getting the remaining hosts for this loop 30575 1726867633.18952: getting the next task for host managed_node3 30575 1726867633.18955: done getting next task for host managed_node3 30575 1726867633.18956: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30575 1726867633.18958: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867633.18963: getting variables 30575 1726867633.18964: in VariableManager get_vars() 30575 1726867633.18971: Calling all_inventory to load vars for managed_node3 30575 1726867633.18972: Calling groups_inventory to load vars for managed_node3 30575 1726867633.18973: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867633.18979: Calling all_plugins_play to load vars for managed_node3 30575 1726867633.18981: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867633.18984: Calling groups_plugins_play to load vars for managed_node3 30575 1726867633.19635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867633.20458: done with get_vars() 30575 1726867633.20472: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 17:27:13 -0400 (0:00:00.075) 0:01:08.582 ****** 30575 1726867633.20524: entering _queue_task() for managed_node3/include_tasks 30575 1726867633.20749: worker is 1 (out of 1 available) 30575 1726867633.20762: exiting _queue_task() for managed_node3/include_tasks 30575 1726867633.20775: done queuing things up, now waiting for results queue to drain 30575 1726867633.20779: waiting for pending results... 
30575 1726867633.21064: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 30575 1726867633.21297: in run() - task 0affcac9-a3a5-e081-a588-0000000015cf 30575 1726867633.21301: variable 'ansible_search_path' from source: unknown 30575 1726867633.21304: variable 'ansible_search_path' from source: unknown 30575 1726867633.21307: calling self._execute() 30575 1726867633.21309: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867633.21314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867633.21317: variable 'omit' from source: magic vars 30575 1726867633.21637: variable 'ansible_distribution_major_version' from source: facts 30575 1726867633.21667: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867633.21681: _execute() done 30575 1726867633.21691: dumping result to json 30575 1726867633.21700: done dumping result, returning 30575 1726867633.21714: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-e081-a588-0000000015cf] 30575 1726867633.21725: sending task result for task 0affcac9-a3a5-e081-a588-0000000015cf 30575 1726867633.21892: no more pending results, returning what we have 30575 1726867633.21897: in VariableManager get_vars() 30575 1726867633.21945: Calling all_inventory to load vars for managed_node3 30575 1726867633.21948: Calling groups_inventory to load vars for managed_node3 30575 1726867633.21951: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867633.21963: Calling all_plugins_play to load vars for managed_node3 30575 1726867633.21966: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867633.21969: Calling groups_plugins_play to load vars for managed_node3 30575 1726867633.22716: done sending task result for task 0affcac9-a3a5-e081-a588-0000000015cf 30575 1726867633.22719: WORKER PROCESS EXITING 30575 
1726867633.23021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867633.23873: done with get_vars() 30575 1726867633.23887: variable 'ansible_search_path' from source: unknown 30575 1726867633.23888: variable 'ansible_search_path' from source: unknown 30575 1726867633.23893: variable 'item' from source: include params 30575 1726867633.23970: variable 'item' from source: include params 30575 1726867633.23994: we have included files to process 30575 1726867633.23995: generating all_blocks data 30575 1726867633.23996: done generating all_blocks data 30575 1726867633.23997: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867633.23998: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867633.23999: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867633.24183: done processing included file 30575 1726867633.24185: iterating over new_blocks loaded from include file 30575 1726867633.24186: in VariableManager get_vars() 30575 1726867633.24204: done with get_vars() 30575 1726867633.24206: filtering new block on tags 30575 1726867633.24236: done filtering new block on tags 30575 1726867633.24239: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 30575 1726867633.24244: extending task lists for all hosts with included blocks 30575 1726867633.24395: done extending task lists 30575 1726867633.24396: done processing included files 30575 1726867633.24397: results queue empty 30575 1726867633.24398: checking for any_errors_fatal 30575 1726867633.24401: done 
checking for any_errors_fatal 30575 1726867633.24402: checking for max_fail_percentage 30575 1726867633.24403: done checking for max_fail_percentage 30575 1726867633.24406: checking to see if all hosts have failed and the running result is not ok 30575 1726867633.24407: done checking to see if all hosts have failed 30575 1726867633.24407: getting the remaining hosts for this loop 30575 1726867633.24408: done getting the remaining hosts for this loop 30575 1726867633.24412: getting the next task for host managed_node3 30575 1726867633.24416: done getting next task for host managed_node3 30575 1726867633.24418: ^ task is: TASK: Get stat for interface {{ interface }} 30575 1726867633.24421: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867633.24423: getting variables 30575 1726867633.24424: in VariableManager get_vars() 30575 1726867633.24432: Calling all_inventory to load vars for managed_node3 30575 1726867633.24434: Calling groups_inventory to load vars for managed_node3 30575 1726867633.24436: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867633.24440: Calling all_plugins_play to load vars for managed_node3 30575 1726867633.24442: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867633.24445: Calling groups_plugins_play to load vars for managed_node3 30575 1726867633.29445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867633.30294: done with get_vars() 30575 1726867633.30310: done getting variables 30575 1726867633.30401: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:27:13 -0400 (0:00:00.098) 0:01:08.681 ****** 30575 1726867633.30423: entering _queue_task() for managed_node3/stat 30575 1726867633.30695: worker is 1 (out of 1 available) 30575 1726867633.30710: exiting _queue_task() for managed_node3/stat 30575 1726867633.30725: done queuing things up, now waiting for results queue to drain 30575 1726867633.30727: waiting for pending results... 
30575 1726867633.30908: running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr 30575 1726867633.31007: in run() - task 0affcac9-a3a5-e081-a588-000000001647 30575 1726867633.31019: variable 'ansible_search_path' from source: unknown 30575 1726867633.31022: variable 'ansible_search_path' from source: unknown 30575 1726867633.31051: calling self._execute() 30575 1726867633.31124: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867633.31128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867633.31136: variable 'omit' from source: magic vars 30575 1726867633.31415: variable 'ansible_distribution_major_version' from source: facts 30575 1726867633.31422: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867633.31428: variable 'omit' from source: magic vars 30575 1726867633.31463: variable 'omit' from source: magic vars 30575 1726867633.31534: variable 'interface' from source: play vars 30575 1726867633.31548: variable 'omit' from source: magic vars 30575 1726867633.31582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867633.31610: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867633.31627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867633.31641: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867633.31651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867633.31675: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867633.31679: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867633.31682: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867633.31752: Set connection var ansible_pipelining to False 30575 1726867633.31756: Set connection var ansible_shell_type to sh 30575 1726867633.31759: Set connection var ansible_shell_executable to /bin/sh 30575 1726867633.31765: Set connection var ansible_timeout to 10 30575 1726867633.31770: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867633.31776: Set connection var ansible_connection to ssh 30575 1726867633.31795: variable 'ansible_shell_executable' from source: unknown 30575 1726867633.31799: variable 'ansible_connection' from source: unknown 30575 1726867633.31801: variable 'ansible_module_compression' from source: unknown 30575 1726867633.31803: variable 'ansible_shell_type' from source: unknown 30575 1726867633.31806: variable 'ansible_shell_executable' from source: unknown 30575 1726867633.31808: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867633.31810: variable 'ansible_pipelining' from source: unknown 30575 1726867633.31816: variable 'ansible_timeout' from source: unknown 30575 1726867633.31819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867633.31964: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867633.31973: variable 'omit' from source: magic vars 30575 1726867633.31981: starting attempt loop 30575 1726867633.31984: running the handler 30575 1726867633.31994: _low_level_execute_command(): starting 30575 1726867633.32001: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867633.32523: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867633.32527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867633.32530: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867633.32533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.32581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867633.32585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867633.32587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867633.32646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867633.34323: stdout chunk (state=3): >>>/root <<< 30575 1726867633.34423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867633.34462: stderr chunk (state=3): >>><<< 30575 1726867633.34465: stdout chunk (state=3): >>><<< 30575 1726867633.34483: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867633.34496: _low_level_execute_command(): starting 30575 1726867633.34501: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867633.3448317-33827-231820325297426 `" && echo ansible-tmp-1726867633.3448317-33827-231820325297426="` echo /root/.ansible/tmp/ansible-tmp-1726867633.3448317-33827-231820325297426 `" ) && sleep 0' 30575 1726867633.34936: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867633.34939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867633.34941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.34951: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 30575 1726867633.34954: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867633.34956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.34992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867633.34995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867633.35014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867633.35066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867633.36960: stdout chunk (state=3): >>>ansible-tmp-1726867633.3448317-33827-231820325297426=/root/.ansible/tmp/ansible-tmp-1726867633.3448317-33827-231820325297426 <<< 30575 1726867633.37066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867633.37094: stderr chunk (state=3): >>><<< 30575 1726867633.37098: stdout chunk (state=3): >>><<< 30575 1726867633.37113: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867633.3448317-33827-231820325297426=/root/.ansible/tmp/ansible-tmp-1726867633.3448317-33827-231820325297426 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867633.37158: variable 'ansible_module_compression' from source: unknown 30575 1726867633.37203: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30575 1726867633.37238: variable 'ansible_facts' from source: unknown 30575 1726867633.37298: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867633.3448317-33827-231820325297426/AnsiballZ_stat.py 30575 1726867633.37398: Sending initial data 30575 1726867633.37402: Sent initial data (153 bytes) 30575 1726867633.37850: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867633.37853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867633.37856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.37859: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867633.37862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.37914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867633.37921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867633.37967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867633.39515: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30575 1726867633.39523: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867633.39556: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867633.39601: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp4tpu68io /root/.ansible/tmp/ansible-tmp-1726867633.3448317-33827-231820325297426/AnsiballZ_stat.py <<< 30575 1726867633.39610: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867633.3448317-33827-231820325297426/AnsiballZ_stat.py" <<< 30575 1726867633.39645: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp4tpu68io" to remote "/root/.ansible/tmp/ansible-tmp-1726867633.3448317-33827-231820325297426/AnsiballZ_stat.py" <<< 30575 1726867633.39651: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867633.3448317-33827-231820325297426/AnsiballZ_stat.py" <<< 30575 1726867633.40192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867633.40231: stderr chunk (state=3): >>><<< 30575 1726867633.40234: stdout chunk (state=3): >>><<< 30575 1726867633.40271: done transferring module to remote 30575 1726867633.40280: _low_level_execute_command(): starting 30575 1726867633.40286: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867633.3448317-33827-231820325297426/ /root/.ansible/tmp/ansible-tmp-1726867633.3448317-33827-231820325297426/AnsiballZ_stat.py && sleep 0' 30575 1726867633.40717: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867633.40722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867633.40724: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.40726: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867633.40732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.40772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867633.40775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867633.40825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867633.42574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867633.42598: stderr chunk (state=3): >>><<< 30575 1726867633.42601: stdout chunk (state=3): >>><<< 30575 1726867633.42615: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867633.42618: _low_level_execute_command(): starting 30575 1726867633.42623: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867633.3448317-33827-231820325297426/AnsiballZ_stat.py && sleep 0' 30575 1726867633.43042: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867633.43045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.43048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867633.43050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867633.43052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.43100: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867633.43103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867633.43155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867633.58551: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31444, "dev": 23, "nlink": 1, "atime": 1726867619.0444045, "mtime": 1726867619.0444045, "ctime": 1726867619.0444045, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30575 1726867633.60147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867633.60151: stdout chunk (state=3): >>><<< 30575 1726867633.60154: stderr chunk (state=3): >>><<< 30575 1726867633.60161: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31444, "dev": 23, "nlink": 1, "atime": 1726867619.0444045, "mtime": 1726867619.0444045, "ctime": 1726867619.0444045, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867633.60166: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867633.3448317-33827-231820325297426/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867633.60189: _low_level_execute_command(): starting 30575 1726867633.60192: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867633.3448317-33827-231820325297426/ > /dev/null 2>&1 && sleep 0' 30575 1726867633.61308: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867633.61353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867633.61368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867633.61389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867633.61494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867633.61509: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867633.61535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867633.61600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867633.63671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867633.63684: stdout chunk (state=3): >>><<< 30575 1726867633.63696: stderr chunk (state=3): >>><<< 30575 1726867633.63718: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867633.63854: handler run complete 30575 1726867633.63857: attempt loop complete, returning result 30575 1726867633.63860: _execute() done 30575 1726867633.63891: dumping result to json 30575 1726867633.63903: done dumping result, returning 30575 1726867633.64083: done running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr [0affcac9-a3a5-e081-a588-000000001647] 30575 1726867633.64086: sending task result for task 0affcac9-a3a5-e081-a588-000000001647 30575 1726867633.64166: done sending task result for task 0affcac9-a3a5-e081-a588-000000001647 30575 1726867633.64170: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726867619.0444045, "block_size": 4096, "blocks": 0, "ctime": 1726867619.0444045, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 31444, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726867619.0444045, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 30575 1726867633.64269: no more pending results, returning what we have 30575 1726867633.64273: 
results queue empty 30575 1726867633.64274: checking for any_errors_fatal 30575 1726867633.64275: done checking for any_errors_fatal 30575 1726867633.64276: checking for max_fail_percentage 30575 1726867633.64279: done checking for max_fail_percentage 30575 1726867633.64280: checking to see if all hosts have failed and the running result is not ok 30575 1726867633.64281: done checking to see if all hosts have failed 30575 1726867633.64282: getting the remaining hosts for this loop 30575 1726867633.64283: done getting the remaining hosts for this loop 30575 1726867633.64287: getting the next task for host managed_node3 30575 1726867633.64297: done getting next task for host managed_node3 30575 1726867633.64301: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 30575 1726867633.64304: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867633.64310: getting variables 30575 1726867633.64311: in VariableManager get_vars() 30575 1726867633.64348: Calling all_inventory to load vars for managed_node3 30575 1726867633.64350: Calling groups_inventory to load vars for managed_node3 30575 1726867633.64353: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867633.64364: Calling all_plugins_play to load vars for managed_node3 30575 1726867633.64366: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867633.64369: Calling groups_plugins_play to load vars for managed_node3 30575 1726867633.66283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867633.67451: done with get_vars() 30575 1726867633.67472: done getting variables 30575 1726867633.67555: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867633.67689: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 17:27:13 -0400 (0:00:00.372) 0:01:09.054 ****** 30575 1726867633.67726: entering _queue_task() for managed_node3/assert 30575 1726867633.68303: worker is 1 (out of 1 available) 30575 1726867633.68319: exiting _queue_task() for managed_node3/assert 30575 1726867633.68333: done queuing things up, now waiting for results queue to drain 30575 1726867633.68335: waiting for pending results... 
30575 1726867633.68721: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'statebr' 30575 1726867633.68751: in run() - task 0affcac9-a3a5-e081-a588-0000000015d0 30575 1726867633.68765: variable 'ansible_search_path' from source: unknown 30575 1726867633.68769: variable 'ansible_search_path' from source: unknown 30575 1726867633.68806: calling self._execute() 30575 1726867633.68954: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867633.68980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867633.68990: variable 'omit' from source: magic vars 30575 1726867633.69319: variable 'ansible_distribution_major_version' from source: facts 30575 1726867633.69328: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867633.69334: variable 'omit' from source: magic vars 30575 1726867633.69365: variable 'omit' from source: magic vars 30575 1726867633.69435: variable 'interface' from source: play vars 30575 1726867633.69449: variable 'omit' from source: magic vars 30575 1726867633.69484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867633.69514: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867633.69527: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867633.69541: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867633.69551: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867633.69574: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867633.69579: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867633.69582: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867633.69651: Set connection var ansible_pipelining to False 30575 1726867633.69655: Set connection var ansible_shell_type to sh 30575 1726867633.69659: Set connection var ansible_shell_executable to /bin/sh 30575 1726867633.69665: Set connection var ansible_timeout to 10 30575 1726867633.69670: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867633.69676: Set connection var ansible_connection to ssh 30575 1726867633.69697: variable 'ansible_shell_executable' from source: unknown 30575 1726867633.69700: variable 'ansible_connection' from source: unknown 30575 1726867633.69704: variable 'ansible_module_compression' from source: unknown 30575 1726867633.69707: variable 'ansible_shell_type' from source: unknown 30575 1726867633.69709: variable 'ansible_shell_executable' from source: unknown 30575 1726867633.69714: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867633.69717: variable 'ansible_pipelining' from source: unknown 30575 1726867633.69719: variable 'ansible_timeout' from source: unknown 30575 1726867633.69722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867633.69818: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867633.69827: variable 'omit' from source: magic vars 30575 1726867633.69839: starting attempt loop 30575 1726867633.69843: running the handler 30575 1726867633.69927: variable 'interface_stat' from source: set_fact 30575 1726867633.69942: Evaluated conditional (interface_stat.stat.exists): True 30575 1726867633.69945: handler run complete 30575 1726867633.69959: attempt loop complete, returning result 30575 
1726867633.69963: _execute() done 30575 1726867633.69965: dumping result to json 30575 1726867633.69968: done dumping result, returning 30575 1726867633.69973: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'statebr' [0affcac9-a3a5-e081-a588-0000000015d0] 30575 1726867633.69979: sending task result for task 0affcac9-a3a5-e081-a588-0000000015d0 30575 1726867633.70136: done sending task result for task 0affcac9-a3a5-e081-a588-0000000015d0 30575 1726867633.70139: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867633.70216: no more pending results, returning what we have 30575 1726867633.70220: results queue empty 30575 1726867633.70221: checking for any_errors_fatal 30575 1726867633.70228: done checking for any_errors_fatal 30575 1726867633.70229: checking for max_fail_percentage 30575 1726867633.70231: done checking for max_fail_percentage 30575 1726867633.70232: checking to see if all hosts have failed and the running result is not ok 30575 1726867633.70232: done checking to see if all hosts have failed 30575 1726867633.70233: getting the remaining hosts for this loop 30575 1726867633.70235: done getting the remaining hosts for this loop 30575 1726867633.70238: getting the next task for host managed_node3 30575 1726867633.70247: done getting next task for host managed_node3 30575 1726867633.70249: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30575 1726867633.70253: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867633.70256: getting variables 30575 1726867633.70258: in VariableManager get_vars() 30575 1726867633.70290: Calling all_inventory to load vars for managed_node3 30575 1726867633.70292: Calling groups_inventory to load vars for managed_node3 30575 1726867633.70295: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867633.70304: Calling all_plugins_play to load vars for managed_node3 30575 1726867633.70307: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867633.70309: Calling groups_plugins_play to load vars for managed_node3 30575 1726867633.71908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867633.73424: done with get_vars() 30575 1726867633.73438: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 17:27:13 -0400 (0:00:00.057) 0:01:09.112 ****** 30575 1726867633.73504: entering _queue_task() for managed_node3/include_tasks 30575 1726867633.73724: worker is 1 (out of 1 available) 30575 1726867633.73737: exiting _queue_task() for managed_node3/include_tasks 30575 1726867633.73750: done queuing things up, now waiting for results queue to drain 30575 1726867633.73752: waiting for pending results... 
30575 1726867633.73923: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 30575 1726867633.74008: in run() - task 0affcac9-a3a5-e081-a588-0000000015d4 30575 1726867633.74020: variable 'ansible_search_path' from source: unknown 30575 1726867633.74024: variable 'ansible_search_path' from source: unknown 30575 1726867633.74051: calling self._execute() 30575 1726867633.74129: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867633.74134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867633.74141: variable 'omit' from source: magic vars 30575 1726867633.74416: variable 'ansible_distribution_major_version' from source: facts 30575 1726867633.74420: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867633.74424: _execute() done 30575 1726867633.74427: dumping result to json 30575 1726867633.74430: done dumping result, returning 30575 1726867633.74439: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-e081-a588-0000000015d4] 30575 1726867633.74444: sending task result for task 0affcac9-a3a5-e081-a588-0000000015d4 30575 1726867633.74531: done sending task result for task 0affcac9-a3a5-e081-a588-0000000015d4 30575 1726867633.74534: WORKER PROCESS EXITING 30575 1726867633.74559: no more pending results, returning what we have 30575 1726867633.74564: in VariableManager get_vars() 30575 1726867633.74607: Calling all_inventory to load vars for managed_node3 30575 1726867633.74610: Calling groups_inventory to load vars for managed_node3 30575 1726867633.74616: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867633.74626: Calling all_plugins_play to load vars for managed_node3 30575 1726867633.74629: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867633.74631: Calling groups_plugins_play to load vars for managed_node3 30575 
1726867633.76722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867633.77709: done with get_vars() 30575 1726867633.77723: variable 'ansible_search_path' from source: unknown 30575 1726867633.77724: variable 'ansible_search_path' from source: unknown 30575 1726867633.77731: variable 'item' from source: include params 30575 1726867633.77803: variable 'item' from source: include params 30575 1726867633.77827: we have included files to process 30575 1726867633.77828: generating all_blocks data 30575 1726867633.77830: done generating all_blocks data 30575 1726867633.77834: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867633.77835: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867633.77837: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867633.78433: done processing included file 30575 1726867633.78434: iterating over new_blocks loaded from include file 30575 1726867633.78435: in VariableManager get_vars() 30575 1726867633.78445: done with get_vars() 30575 1726867633.78447: filtering new block on tags 30575 1726867633.78491: done filtering new block on tags 30575 1726867633.78494: in VariableManager get_vars() 30575 1726867633.78503: done with get_vars() 30575 1726867633.78504: filtering new block on tags 30575 1726867633.78536: done filtering new block on tags 30575 1726867633.78538: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 30575 1726867633.78542: extending task lists for all hosts with included blocks 30575 1726867633.78715: done 
extending task lists 30575 1726867633.78716: done processing included files 30575 1726867633.78717: results queue empty 30575 1726867633.78718: checking for any_errors_fatal 30575 1726867633.78720: done checking for any_errors_fatal 30575 1726867633.78721: checking for max_fail_percentage 30575 1726867633.78722: done checking for max_fail_percentage 30575 1726867633.78723: checking to see if all hosts have failed and the running result is not ok 30575 1726867633.78724: done checking to see if all hosts have failed 30575 1726867633.78724: getting the remaining hosts for this loop 30575 1726867633.78726: done getting the remaining hosts for this loop 30575 1726867633.78728: getting the next task for host managed_node3 30575 1726867633.78732: done getting next task for host managed_node3 30575 1726867633.78734: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30575 1726867633.78737: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30575 1726867633.78740: getting variables 30575 1726867633.78741: in VariableManager get_vars() 30575 1726867633.78750: Calling all_inventory to load vars for managed_node3 30575 1726867633.78752: Calling groups_inventory to load vars for managed_node3 30575 1726867633.78754: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867633.78760: Calling all_plugins_play to load vars for managed_node3 30575 1726867633.78762: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867633.78765: Calling groups_plugins_play to load vars for managed_node3 30575 1726867633.79771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867633.80665: done with get_vars() 30575 1726867633.80681: done getting variables 30575 1726867633.80709: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:27:13 -0400 (0:00:00.072) 0:01:09.184 ****** 30575 1726867633.80733: entering _queue_task() for managed_node3/set_fact 30575 1726867633.80964: worker is 1 (out of 1 available) 30575 1726867633.80976: exiting _queue_task() for managed_node3/set_fact 30575 1726867633.80991: done queuing things up, now waiting for results queue to drain 30575 1726867633.80992: waiting for pending results... 
30575 1726867633.81167: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 30575 1726867633.81256: in run() - task 0affcac9-a3a5-e081-a588-000000001665 30575 1726867633.81270: variable 'ansible_search_path' from source: unknown 30575 1726867633.81273: variable 'ansible_search_path' from source: unknown 30575 1726867633.81301: calling self._execute() 30575 1726867633.81370: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867633.81375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867633.81385: variable 'omit' from source: magic vars 30575 1726867633.81657: variable 'ansible_distribution_major_version' from source: facts 30575 1726867633.81667: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867633.81671: variable 'omit' from source: magic vars 30575 1726867633.81707: variable 'omit' from source: magic vars 30575 1726867633.81732: variable 'omit' from source: magic vars 30575 1726867633.81763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867633.81795: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867633.81810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867633.81826: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867633.81836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867633.81860: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867633.81864: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867633.81867: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867633.81940: Set connection var ansible_pipelining to False 30575 1726867633.81944: Set connection var ansible_shell_type to sh 30575 1726867633.81947: Set connection var ansible_shell_executable to /bin/sh 30575 1726867633.81953: Set connection var ansible_timeout to 10 30575 1726867633.81958: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867633.81964: Set connection var ansible_connection to ssh 30575 1726867633.81984: variable 'ansible_shell_executable' from source: unknown 30575 1726867633.81989: variable 'ansible_connection' from source: unknown 30575 1726867633.81992: variable 'ansible_module_compression' from source: unknown 30575 1726867633.81995: variable 'ansible_shell_type' from source: unknown 30575 1726867633.81997: variable 'ansible_shell_executable' from source: unknown 30575 1726867633.82000: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867633.82002: variable 'ansible_pipelining' from source: unknown 30575 1726867633.82004: variable 'ansible_timeout' from source: unknown 30575 1726867633.82007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867633.82103: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867633.82113: variable 'omit' from source: magic vars 30575 1726867633.82125: starting attempt loop 30575 1726867633.82129: running the handler 30575 1726867633.82137: handler run complete 30575 1726867633.82146: attempt loop complete, returning result 30575 1726867633.82149: _execute() done 30575 1726867633.82152: dumping result to json 30575 1726867633.82154: done dumping result, returning 30575 1726867633.82159: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-e081-a588-000000001665] 30575 1726867633.82165: sending task result for task 0affcac9-a3a5-e081-a588-000000001665 30575 1726867633.82242: done sending task result for task 0affcac9-a3a5-e081-a588-000000001665 30575 1726867633.82245: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30575 1726867633.82298: no more pending results, returning what we have 30575 1726867633.82302: results queue empty 30575 1726867633.82303: checking for any_errors_fatal 30575 1726867633.82304: done checking for any_errors_fatal 30575 1726867633.82305: checking for max_fail_percentage 30575 1726867633.82306: done checking for max_fail_percentage 30575 1726867633.82307: checking to see if all hosts have failed and the running result is not ok 30575 1726867633.82308: done checking to see if all hosts have failed 30575 1726867633.82308: getting the remaining hosts for this loop 30575 1726867633.82310: done getting the remaining hosts for this loop 30575 1726867633.82313: getting the next task for host managed_node3 30575 1726867633.82322: done getting next task for host managed_node3 30575 1726867633.82324: ^ task is: TASK: Stat profile file 30575 1726867633.82328: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867633.82331: getting variables 30575 1726867633.82333: in VariableManager get_vars() 30575 1726867633.82368: Calling all_inventory to load vars for managed_node3 30575 1726867633.82370: Calling groups_inventory to load vars for managed_node3 30575 1726867633.82373: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867633.82391: Calling all_plugins_play to load vars for managed_node3 30575 1726867633.82394: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867633.82397: Calling groups_plugins_play to load vars for managed_node3 30575 1726867633.83146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867633.84005: done with get_vars() 30575 1726867633.84021: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:27:13 -0400 (0:00:00.033) 0:01:09.218 ****** 30575 1726867633.84083: entering _queue_task() for managed_node3/stat 30575 1726867633.84281: worker is 1 (out of 1 available) 30575 1726867633.84293: exiting _queue_task() for managed_node3/stat 30575 1726867633.84308: done queuing things up, now waiting for results queue to drain 30575 1726867633.84309: 
waiting for pending results... 30575 1726867633.84472: running TaskExecutor() for managed_node3/TASK: Stat profile file 30575 1726867633.84567: in run() - task 0affcac9-a3a5-e081-a588-000000001666 30575 1726867633.84580: variable 'ansible_search_path' from source: unknown 30575 1726867633.84584: variable 'ansible_search_path' from source: unknown 30575 1726867633.84617: calling self._execute() 30575 1726867633.84687: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867633.84691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867633.84700: variable 'omit' from source: magic vars 30575 1726867633.84967: variable 'ansible_distribution_major_version' from source: facts 30575 1726867633.84975: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867633.84983: variable 'omit' from source: magic vars 30575 1726867633.85021: variable 'omit' from source: magic vars 30575 1726867633.85087: variable 'profile' from source: play vars 30575 1726867633.85091: variable 'interface' from source: play vars 30575 1726867633.85139: variable 'interface' from source: play vars 30575 1726867633.85153: variable 'omit' from source: magic vars 30575 1726867633.85187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867633.85217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867633.85230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867633.85243: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867633.85253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867633.85278: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30575 1726867633.85282: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867633.85284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867633.85352: Set connection var ansible_pipelining to False 30575 1726867633.85355: Set connection var ansible_shell_type to sh 30575 1726867633.85360: Set connection var ansible_shell_executable to /bin/sh 30575 1726867633.85365: Set connection var ansible_timeout to 10 30575 1726867633.85370: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867633.85376: Set connection var ansible_connection to ssh 30575 1726867633.85394: variable 'ansible_shell_executable' from source: unknown 30575 1726867633.85397: variable 'ansible_connection' from source: unknown 30575 1726867633.85400: variable 'ansible_module_compression' from source: unknown 30575 1726867633.85402: variable 'ansible_shell_type' from source: unknown 30575 1726867633.85406: variable 'ansible_shell_executable' from source: unknown 30575 1726867633.85408: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867633.85413: variable 'ansible_pipelining' from source: unknown 30575 1726867633.85416: variable 'ansible_timeout' from source: unknown 30575 1726867633.85418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867633.85555: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867633.85563: variable 'omit' from source: magic vars 30575 1726867633.85569: starting attempt loop 30575 1726867633.85571: running the handler 30575 1726867633.85584: _low_level_execute_command(): starting 30575 1726867633.85591: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 
1726867633.86062: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867633.86075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.86105: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867633.86109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.86157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867633.86160: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867633.86167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867633.86226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867633.87909: stdout chunk (state=3): >>>/root <<< 30575 1726867633.88005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867633.88030: stderr chunk (state=3): >>><<< 30575 1726867633.88034: stdout chunk (state=3): >>><<< 30575 1726867633.88053: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867633.88064: _low_level_execute_command(): starting 30575 1726867633.88069: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867633.880527-33857-17472943208731 `" && echo ansible-tmp-1726867633.880527-33857-17472943208731="` echo /root/.ansible/tmp/ansible-tmp-1726867633.880527-33857-17472943208731 `" ) && sleep 0' 30575 1726867633.88452: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867633.88456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867633.88489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: 
match not found <<< 30575 1726867633.88499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.88502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867633.88504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867633.88513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.88561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867633.88568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867633.88570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867633.88615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867633.90503: stdout chunk (state=3): >>>ansible-tmp-1726867633.880527-33857-17472943208731=/root/.ansible/tmp/ansible-tmp-1726867633.880527-33857-17472943208731 <<< 30575 1726867633.90615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867633.90634: stderr chunk (state=3): >>><<< 30575 1726867633.90638: stdout chunk (state=3): >>><<< 30575 1726867633.90653: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867633.880527-33857-17472943208731=/root/.ansible/tmp/ansible-tmp-1726867633.880527-33857-17472943208731 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867633.90687: variable 'ansible_module_compression' from source: unknown 30575 1726867633.90730: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30575 1726867633.90762: variable 'ansible_facts' from source: unknown 30575 1726867633.90824: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867633.880527-33857-17472943208731/AnsiballZ_stat.py 30575 1726867633.90919: Sending initial data 30575 1726867633.90923: Sent initial data (151 bytes) 30575 1726867633.91338: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867633.91341: stderr chunk (state=3): >>>debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867633.91344: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867633.91346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.91393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867633.91400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867633.91443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867633.92979: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867633.92982: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867633.93024: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867633.93068: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpp9vca6sq /root/.ansible/tmp/ansible-tmp-1726867633.880527-33857-17472943208731/AnsiballZ_stat.py <<< 30575 1726867633.93071: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867633.880527-33857-17472943208731/AnsiballZ_stat.py" <<< 30575 1726867633.93107: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpp9vca6sq" to remote "/root/.ansible/tmp/ansible-tmp-1726867633.880527-33857-17472943208731/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867633.880527-33857-17472943208731/AnsiballZ_stat.py" <<< 30575 1726867633.93638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867633.93671: stderr chunk (state=3): >>><<< 30575 1726867633.93675: stdout chunk (state=3): >>><<< 30575 1726867633.93712: done transferring module to remote 30575 1726867633.93722: _low_level_execute_command(): starting 30575 1726867633.93725: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867633.880527-33857-17472943208731/ /root/.ansible/tmp/ansible-tmp-1726867633.880527-33857-17472943208731/AnsiballZ_stat.py && sleep 0' 30575 1726867633.94117: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867633.94121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867633.94123: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.94125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867633.94130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.94171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867633.94174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867633.94223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867633.95958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867633.95980: stderr chunk (state=3): >>><<< 30575 1726867633.95983: stdout chunk (state=3): >>><<< 30575 1726867633.95994: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867633.95997: _low_level_execute_command(): starting 30575 1726867633.96001: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867633.880527-33857-17472943208731/AnsiballZ_stat.py && sleep 0' 30575 1726867633.96413: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867633.96416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867633.96418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.96421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867633.96423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867633.96466: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867633.96470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867633.96522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867634.11744: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30575 1726867634.13240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867634.13244: stdout chunk (state=3): >>><<< 30575 1726867634.13258: stderr chunk (state=3): >>><<< 30575 1726867634.13269: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867634.13308: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867633.880527-33857-17472943208731/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867634.13371: _low_level_execute_command(): starting 30575 1726867634.13380: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867633.880527-33857-17472943208731/ > /dev/null 2>&1 && sleep 0' 30575 1726867634.14516: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867634.14973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867634.14979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867634.15050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867634.16982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867634.16985: stderr chunk (state=3): >>><<< 30575 1726867634.16987: stdout chunk (state=3): >>><<< 30575 1726867634.16990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30575 1726867634.16992: handler run complete
30575 1726867634.16995: attempt loop complete, returning result
30575 1726867634.16997: _execute() done
30575 1726867634.16999: dumping result to json
30575 1726867634.17000: done dumping result, returning
30575 1726867634.17002: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcac9-a3a5-e081-a588-000000001666]
30575 1726867634.17004: sending task result for task 0affcac9-a3a5-e081-a588-000000001666
30575 1726867634.17098: done sending task result for task 0affcac9-a3a5-e081-a588-000000001666
30575 1726867634.17102: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
30575 1726867634.17161: no more pending results, returning what we have
30575 1726867634.17164: results queue empty
30575 1726867634.17165: checking for any_errors_fatal
30575 1726867634.17173: done checking for any_errors_fatal
30575 1726867634.17173: checking for max_fail_percentage
30575 1726867634.17175: done checking for max_fail_percentage
30575 1726867634.17176: checking to see if all hosts have failed and the running result is not ok
30575 1726867634.17192: done checking to see if all hosts have failed
30575 1726867634.17194: getting the remaining hosts for this loop
30575 1726867634.17195: done getting the remaining hosts for this loop
30575 1726867634.17201: getting the next task for host managed_node3
30575 1726867634.17210: done getting next task for host managed_node3
30575 1726867634.17212: ^ task is: TASK: Set NM profile exist flag based on the profile files
30575 1726867634.17218: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0,
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30575 1726867634.17223: getting variables
30575 1726867634.17225: in VariableManager get_vars()
30575 1726867634.17268: Calling all_inventory to load vars for managed_node3
30575 1726867634.17270: Calling groups_inventory to load vars for managed_node3
30575 1726867634.17275: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867634.17492: Calling all_plugins_play to load vars for managed_node3
30575 1726867634.17495: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867634.17499: Calling groups_plugins_play to load vars for managed_node3
30575 1726867634.19385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867634.21016: done with get_vars()
30575 1726867634.21037: done getting variables
30575 1726867634.21101: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag based on the profile files] ********************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17
Friday 20 September 2024 17:27:14 -0400 (0:00:00.370) 0:01:09.588 ******
30575 1726867634.21134: entering _queue_task() for managed_node3/set_fact
30575 1726867634.21466: worker is 1 (out of 1 available)
30575 1726867634.21584: exiting _queue_task() for managed_node3/set_fact
30575 1726867634.21596: done queuing things up, now waiting for results queue to drain
30575 1726867634.21597: waiting for pending results...
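The set_fact task queued here (get_profile_stat.yml:17) exists to flip a flag when the earlier `stat` of /etc/sysconfig/network-scripts/ifcfg-statebr found a file; the stat result earlier in the trace returned `exists: false`, so the conditional will evaluate False and the task will skip. A plausible shape for that pair of tasks, as a hypothetical reconstruction from the module arguments and variable names in this trace rather than the verbatim playbook, is:

```yaml
# Hypothetical sketch inferred from the trace; the flag variable name
# below is an assumption, everything else appears in the log.
- name: Stat profile file
  ansible.builtin.stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat

- name: Set NM profile exist flag based on the profile files
  ansible.builtin.set_fact:
    profile_exists: true          # flag name is an assumption
  when: profile_stat.stat.exists  # False here, so the task skips
```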
30575 1726867634.21894: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files
30575 1726867634.21917: in run() - task 0affcac9-a3a5-e081-a588-000000001667
30575 1726867634.21939: variable 'ansible_search_path' from source: unknown
30575 1726867634.21943: variable 'ansible_search_path' from source: unknown
30575 1726867634.21980: calling self._execute()
30575 1726867634.22081: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867634.22087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867634.22098: variable 'omit' from source: magic vars
30575 1726867634.22498: variable 'ansible_distribution_major_version' from source: facts
30575 1726867634.22510: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867634.22641: variable 'profile_stat' from source: set_fact
30575 1726867634.22650: Evaluated conditional (profile_stat.stat.exists): False
30575 1726867634.22653: when evaluation is False, skipping this task
30575 1726867634.22656: _execute() done
30575 1726867634.22658: dumping result to json
30575 1726867634.22681: done dumping result, returning
30575 1726867634.22685: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-e081-a588-000000001667]
30575 1726867634.22687: sending task result for task 0affcac9-a3a5-e081-a588-000000001667
30575 1726867634.22770: done sending task result for task 0affcac9-a3a5-e081-a588-000000001667
30575 1726867634.22773: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30575 1726867634.22828: no more pending results, returning what we have
30575 1726867634.22832: results queue empty
30575 1726867634.22833: checking for any_errors_fatal
30575 1726867634.22844: done checking for any_errors_fatal
30575 1726867634.22845:
checking for max_fail_percentage 30575 1726867634.22847: done checking for max_fail_percentage 30575 1726867634.22848: checking to see if all hosts have failed and the running result is not ok 30575 1726867634.22849: done checking to see if all hosts have failed 30575 1726867634.22850: getting the remaining hosts for this loop 30575 1726867634.22851: done getting the remaining hosts for this loop 30575 1726867634.22855: getting the next task for host managed_node3 30575 1726867634.22865: done getting next task for host managed_node3 30575 1726867634.22868: ^ task is: TASK: Get NM profile info 30575 1726867634.22873: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30575 1726867634.22879: getting variables
30575 1726867634.22882: in VariableManager get_vars()
30575 1726867634.23017: Calling all_inventory to load vars for managed_node3
30575 1726867634.23020: Calling groups_inventory to load vars for managed_node3
30575 1726867634.23024: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867634.23035: Calling all_plugins_play to load vars for managed_node3
30575 1726867634.23038: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867634.23041: Calling groups_plugins_play to load vars for managed_node3
30575 1726867634.24494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867634.26819: done with get_vars()
30575 1726867634.26842: done getting variables
30575 1726867634.26911: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Get NM profile info] *****************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Friday 20 September 2024 17:27:14 -0400 (0:00:00.058) 0:01:09.646 ******
30575 1726867634.26943: entering _queue_task() for managed_node3/shell
30575 1726867634.27237: worker is 1 (out of 1 available)
30575 1726867634.27249: exiting _queue_task() for managed_node3/shell
30575 1726867634.27263: done queuing things up, now waiting for results queue to drain
30575 1726867634.27264: waiting for pending results...
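The shell task queued here (get_profile_stat.yml:25) runs the nmcli pipeline whose exact command string is recorded later in the trace as `nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc`. A hypothetical reconstruction of the task, with the profile name templated and the register name an assumption, is:

```yaml
# Hypothetical sketch; the command string matches the logged module
# invocation with "statebr" substituted for {{ profile }}.
- name: Get NM profile info
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists  # register name is an assumption
  ignore_errors: true          # grep exits 1 when the profile has no /etc-backed file
```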
30575 1726867634.27693: running TaskExecutor() for managed_node3/TASK: Get NM profile info 30575 1726867634.27698: in run() - task 0affcac9-a3a5-e081-a588-000000001668 30575 1726867634.27701: variable 'ansible_search_path' from source: unknown 30575 1726867634.27704: variable 'ansible_search_path' from source: unknown 30575 1726867634.27726: calling self._execute() 30575 1726867634.27818: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867634.27823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867634.27833: variable 'omit' from source: magic vars 30575 1726867634.28217: variable 'ansible_distribution_major_version' from source: facts 30575 1726867634.28227: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867634.28233: variable 'omit' from source: magic vars 30575 1726867634.28286: variable 'omit' from source: magic vars 30575 1726867634.28393: variable 'profile' from source: play vars 30575 1726867634.28397: variable 'interface' from source: play vars 30575 1726867634.28632: variable 'interface' from source: play vars 30575 1726867634.28635: variable 'omit' from source: magic vars 30575 1726867634.28638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867634.28641: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867634.28659: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867634.28673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867634.28686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867634.28983: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 
1726867634.28987: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867634.28989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867634.28992: Set connection var ansible_pipelining to False 30575 1726867634.28994: Set connection var ansible_shell_type to sh 30575 1726867634.28997: Set connection var ansible_shell_executable to /bin/sh 30575 1726867634.28999: Set connection var ansible_timeout to 10 30575 1726867634.29001: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867634.29003: Set connection var ansible_connection to ssh 30575 1726867634.29005: variable 'ansible_shell_executable' from source: unknown 30575 1726867634.29008: variable 'ansible_connection' from source: unknown 30575 1726867634.29010: variable 'ansible_module_compression' from source: unknown 30575 1726867634.29015: variable 'ansible_shell_type' from source: unknown 30575 1726867634.29017: variable 'ansible_shell_executable' from source: unknown 30575 1726867634.29019: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867634.29021: variable 'ansible_pipelining' from source: unknown 30575 1726867634.29023: variable 'ansible_timeout' from source: unknown 30575 1726867634.29025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867634.29028: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867634.29030: variable 'omit' from source: magic vars 30575 1726867634.29276: starting attempt loop 30575 1726867634.29281: running the handler 30575 1726867634.29284: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867634.29287: _low_level_execute_command(): starting 30575 1726867634.29289: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867634.30323: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867634.30396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867634.30529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867634.30541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867634.30561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867634.30630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867634.32359: stdout chunk (state=3): >>>/root <<< 30575 1726867634.32421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867634.32576: stderr chunk (state=3): 
>>><<< 30575 1726867634.32582: stdout chunk (state=3): >>><<< 30575 1726867634.32605: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867634.32621: _low_level_execute_command(): starting 30575 1726867634.32627: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867634.3260782-33880-54573099155296 `" && echo ansible-tmp-1726867634.3260782-33880-54573099155296="` echo /root/.ansible/tmp/ansible-tmp-1726867634.3260782-33880-54573099155296 `" ) && sleep 0' 30575 1726867634.33363: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867634.33373: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30575 1726867634.33395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867634.33402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867634.33468: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867634.33497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867634.33502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867634.33519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867634.33595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867634.35583: stdout chunk (state=3): >>>ansible-tmp-1726867634.3260782-33880-54573099155296=/root/.ansible/tmp/ansible-tmp-1726867634.3260782-33880-54573099155296 <<< 30575 1726867634.35694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867634.35697: stdout chunk (state=3): >>><<< 30575 1726867634.35704: stderr chunk (state=3): >>><<< 30575 1726867634.35734: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867634.3260782-33880-54573099155296=/root/.ansible/tmp/ansible-tmp-1726867634.3260782-33880-54573099155296 , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867634.35757: variable 'ansible_module_compression' from source: unknown 30575 1726867634.35809: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867634.35962: variable 'ansible_facts' from source: unknown 30575 1726867634.36074: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867634.3260782-33880-54573099155296/AnsiballZ_command.py 30575 1726867634.36273: Sending initial data 30575 1726867634.36276: Sent initial data (155 bytes) 30575 1726867634.36756: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867634.36828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867634.36831: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30575 1726867634.36833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867634.36887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867634.36900: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867634.36917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867634.36993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867634.38875: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867634.38921: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867634.39024: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp4vc1pwuu /root/.ansible/tmp/ansible-tmp-1726867634.3260782-33880-54573099155296/AnsiballZ_command.py <<< 30575 1726867634.39028: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867634.3260782-33880-54573099155296/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp4vc1pwuu" to remote "/root/.ansible/tmp/ansible-tmp-1726867634.3260782-33880-54573099155296/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867634.3260782-33880-54573099155296/AnsiballZ_command.py" <<< 30575 1726867634.39955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867634.39988: stderr chunk (state=3): >>><<< 30575 1726867634.40144: stdout chunk (state=3): >>><<< 30575 1726867634.40148: done transferring module to remote 30575 1726867634.40150: _low_level_execute_command(): starting 30575 1726867634.40153: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867634.3260782-33880-54573099155296/ /root/.ansible/tmp/ansible-tmp-1726867634.3260782-33880-54573099155296/AnsiballZ_command.py && sleep 0' 30575 1726867634.40709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867634.40729: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867634.40818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867634.40839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867634.40859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867634.40875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867634.40957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867634.42987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867634.42990: stdout chunk (state=3): >>><<< 30575 1726867634.42992: stderr chunk (state=3): >>><<< 30575 1726867634.42995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867634.42998: _low_level_execute_command(): starting 30575 1726867634.43000: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867634.3260782-33880-54573099155296/AnsiballZ_command.py && sleep 0' 30575 1726867634.43491: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867634.43504: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867634.43519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867634.43538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867634.43555: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867634.43567: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867634.43661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867634.43674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867634.43700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867634.43771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867634.60627: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 17:27:14.587216", "end": "2024-09-20 17:27:14.603960", "delta": "0:00:00.016744", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867634.62108: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867634.62132: stdout chunk (state=3): >>><<< 30575 1726867634.62148: stderr chunk (state=3): >>><<< 30575 1726867634.62175: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 17:27:14.587216", "end": "2024-09-20 17:27:14.603960", "delta": "0:00:00.016744", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.68 
closed. 30575 1726867634.62237: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867634.3260782-33880-54573099155296/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867634.62252: _low_level_execute_command(): starting 30575 1726867634.62333: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867634.3260782-33880-54573099155296/ > /dev/null 2>&1 && sleep 0' 30575 1726867634.62901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867634.62996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867634.63027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867634.63044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867634.63127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867634.64963: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867634.64972: stdout chunk (state=3): >>><<< 30575 1726867634.64985: stderr chunk (state=3): >>><<< 30575 1726867634.65004: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 
1726867634.65018: handler run complete 30575 1726867634.65187: Evaluated conditional (False): False 30575 1726867634.65190: attempt loop complete, returning result 30575 1726867634.65192: _execute() done 30575 1726867634.65195: dumping result to json 30575 1726867634.65197: done dumping result, returning 30575 1726867634.65199: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcac9-a3a5-e081-a588-000000001668] 30575 1726867634.65201: sending task result for task 0affcac9-a3a5-e081-a588-000000001668 30575 1726867634.65275: done sending task result for task 0affcac9-a3a5-e081-a588-000000001668 30575 1726867634.65280: WORKER PROCESS EXITING fatal: [managed_node3]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.016744", "end": "2024-09-20 17:27:14.603960", "rc": 1, "start": "2024-09-20 17:27:14.587216" } MSG: non-zero return code ...ignoring 30575 1726867634.65367: no more pending results, returning what we have 30575 1726867634.65371: results queue empty 30575 1726867634.65372: checking for any_errors_fatal 30575 1726867634.65380: done checking for any_errors_fatal 30575 1726867634.65381: checking for max_fail_percentage 30575 1726867634.65383: done checking for max_fail_percentage 30575 1726867634.65384: checking to see if all hosts have failed and the running result is not ok 30575 1726867634.65385: done checking to see if all hosts have failed 30575 1726867634.65386: getting the remaining hosts for this loop 30575 1726867634.65388: done getting the remaining hosts for this loop 30575 1726867634.65391: getting the next task for host managed_node3 30575 1726867634.65586: done getting next task for host managed_node3 30575 1726867634.65589: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30575 1726867634.65594: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867634.65599: getting variables 30575 1726867634.65600: in VariableManager get_vars() 30575 1726867634.65641: Calling all_inventory to load vars for managed_node3 30575 1726867634.65644: Calling groups_inventory to load vars for managed_node3 30575 1726867634.65647: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867634.65659: Calling all_plugins_play to load vars for managed_node3 30575 1726867634.65662: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867634.65665: Calling groups_plugins_play to load vars for managed_node3 30575 1726867634.69035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867634.70626: done with get_vars() 30575 1726867634.70655: done getting variables 30575 1726867634.70723: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:27:14 -0400 (0:00:00.438) 0:01:10.085 ****** 30575 1726867634.70758: entering _queue_task() for managed_node3/set_fact 30575 1726867634.71258: worker is 1 (out of 1 available) 30575 1726867634.71271: exiting _queue_task() for managed_node3/set_fact 30575 1726867634.71392: done queuing things up, now waiting for results queue to drain 30575 1726867634.71394: waiting for pending results... 
30575 1726867634.71855: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30575 1726867634.71967: in run() - task 0affcac9-a3a5-e081-a588-000000001669 30575 1726867634.72033: variable 'ansible_search_path' from source: unknown 30575 1726867634.72038: variable 'ansible_search_path' from source: unknown 30575 1726867634.72129: calling self._execute() 30575 1726867634.72219: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867634.72284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867634.72288: variable 'omit' from source: magic vars 30575 1726867634.72773: variable 'ansible_distribution_major_version' from source: facts 30575 1726867634.72801: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867634.72976: variable 'nm_profile_exists' from source: set_fact 30575 1726867634.73000: Evaluated conditional (nm_profile_exists.rc == 0): False 30575 1726867634.73008: when evaluation is False, skipping this task 30575 1726867634.73021: _execute() done 30575 1726867634.73084: dumping result to json 30575 1726867634.73090: done dumping result, returning 30575 1726867634.73096: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-e081-a588-000000001669] 30575 1726867634.73098: sending task result for task 0affcac9-a3a5-e081-a588-000000001669 30575 1726867634.73174: done sending task result for task 0affcac9-a3a5-e081-a588-000000001669 30575 1726867634.73180: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 30575 1726867634.73252: no more pending results, returning what we have 30575 1726867634.73258: results queue empty 30575 1726867634.73259: checking for any_errors_fatal 30575 
1726867634.73270: done checking for any_errors_fatal 30575 1726867634.73271: checking for max_fail_percentage 30575 1726867634.73272: done checking for max_fail_percentage 30575 1726867634.73273: checking to see if all hosts have failed and the running result is not ok 30575 1726867634.73275: done checking to see if all hosts have failed 30575 1726867634.73275: getting the remaining hosts for this loop 30575 1726867634.73279: done getting the remaining hosts for this loop 30575 1726867634.73283: getting the next task for host managed_node3 30575 1726867634.73313: done getting next task for host managed_node3 30575 1726867634.73317: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30575 1726867634.73324: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867634.73329: getting variables 30575 1726867634.73332: in VariableManager get_vars() 30575 1726867634.73372: Calling all_inventory to load vars for managed_node3 30575 1726867634.73375: Calling groups_inventory to load vars for managed_node3 30575 1726867634.73413: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867634.73427: Calling all_plugins_play to load vars for managed_node3 30575 1726867634.73431: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867634.73516: Calling groups_plugins_play to load vars for managed_node3 30575 1726867634.75449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867634.77195: done with get_vars() 30575 1726867634.77210: done getting variables 30575 1726867634.77251: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867634.77338: variable 'profile' from source: play vars 30575 1726867634.77341: variable 'interface' from source: play vars 30575 1726867634.77385: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:27:14 -0400 (0:00:00.066) 0:01:10.151 ****** 30575 1726867634.77408: entering _queue_task() for managed_node3/command 30575 1726867634.77637: worker is 1 (out of 1 available) 30575 1726867634.77650: exiting _queue_task() for managed_node3/command 30575 1726867634.77666: done queuing things up, now waiting for results queue to drain 30575 1726867634.77668: waiting for pending results... 
30575 1726867634.77847: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr 30575 1726867634.77944: in run() - task 0affcac9-a3a5-e081-a588-00000000166b 30575 1726867634.77955: variable 'ansible_search_path' from source: unknown 30575 1726867634.77959: variable 'ansible_search_path' from source: unknown 30575 1726867634.77989: calling self._execute() 30575 1726867634.78060: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867634.78065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867634.78073: variable 'omit' from source: magic vars 30575 1726867634.78624: variable 'ansible_distribution_major_version' from source: facts 30575 1726867634.78628: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867634.78631: variable 'profile_stat' from source: set_fact 30575 1726867634.78633: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867634.78635: when evaluation is False, skipping this task 30575 1726867634.78638: _execute() done 30575 1726867634.78640: dumping result to json 30575 1726867634.78642: done dumping result, returning 30575 1726867634.78645: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-00000000166b] 30575 1726867634.78647: sending task result for task 0affcac9-a3a5-e081-a588-00000000166b 30575 1726867634.78713: done sending task result for task 0affcac9-a3a5-e081-a588-00000000166b 30575 1726867634.78716: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867634.78768: no more pending results, returning what we have 30575 1726867634.78771: results queue empty 30575 1726867634.78772: checking for any_errors_fatal 30575 1726867634.78782: done checking for any_errors_fatal 30575 1726867634.78783: 
checking for max_fail_percentage 30575 1726867634.78784: done checking for max_fail_percentage 30575 1726867634.78785: checking to see if all hosts have failed and the running result is not ok 30575 1726867634.78786: done checking to see if all hosts have failed 30575 1726867634.78787: getting the remaining hosts for this loop 30575 1726867634.78788: done getting the remaining hosts for this loop 30575 1726867634.78791: getting the next task for host managed_node3 30575 1726867634.78799: done getting next task for host managed_node3 30575 1726867634.78802: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30575 1726867634.78806: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867634.78809: getting variables 30575 1726867634.78812: in VariableManager get_vars() 30575 1726867634.78876: Calling all_inventory to load vars for managed_node3 30575 1726867634.78881: Calling groups_inventory to load vars for managed_node3 30575 1726867634.78884: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867634.78892: Calling all_plugins_play to load vars for managed_node3 30575 1726867634.78894: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867634.78896: Calling groups_plugins_play to load vars for managed_node3 30575 1726867634.80791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867634.81695: done with get_vars() 30575 1726867634.81710: done getting variables 30575 1726867634.81751: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867634.81827: variable 'profile' from source: play vars 30575 1726867634.81830: variable 'interface' from source: play vars 30575 1726867634.81868: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:27:14 -0400 (0:00:00.044) 0:01:10.196 ****** 30575 1726867634.81895: entering _queue_task() for managed_node3/set_fact 30575 1726867634.82155: worker is 1 (out of 1 available) 30575 1726867634.82167: exiting _queue_task() for managed_node3/set_fact 30575 1726867634.82183: done queuing things up, now waiting for results queue to drain 30575 1726867634.82185: waiting for pending results... 
30575 1726867634.82541: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr 30575 1726867634.82687: in run() - task 0affcac9-a3a5-e081-a588-00000000166c 30575 1726867634.82692: variable 'ansible_search_path' from source: unknown 30575 1726867634.82695: variable 'ansible_search_path' from source: unknown 30575 1726867634.82712: calling self._execute() 30575 1726867634.82808: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867634.82843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867634.83080: variable 'omit' from source: magic vars 30575 1726867634.83668: variable 'ansible_distribution_major_version' from source: facts 30575 1726867634.83689: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867634.83872: variable 'profile_stat' from source: set_fact 30575 1726867634.83897: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867634.83906: when evaluation is False, skipping this task 30575 1726867634.83918: _execute() done 30575 1726867634.83935: dumping result to json 30575 1726867634.83953: done dumping result, returning 30575 1726867634.83965: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-00000000166c] 30575 1726867634.83979: sending task result for task 0affcac9-a3a5-e081-a588-00000000166c skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867634.84129: no more pending results, returning what we have 30575 1726867634.84133: results queue empty 30575 1726867634.84133: checking for any_errors_fatal 30575 1726867634.84145: done checking for any_errors_fatal 30575 1726867634.84146: checking for max_fail_percentage 30575 1726867634.84147: done checking for max_fail_percentage 30575 1726867634.84148: checking to see if all 
hosts have failed and the running result is not ok 30575 1726867634.84150: done checking to see if all hosts have failed 30575 1726867634.84151: getting the remaining hosts for this loop 30575 1726867634.84152: done getting the remaining hosts for this loop 30575 1726867634.84157: getting the next task for host managed_node3 30575 1726867634.84165: done getting next task for host managed_node3 30575 1726867634.84168: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30575 1726867634.84173: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867634.84179: getting variables 30575 1726867634.84180: in VariableManager get_vars() 30575 1726867634.84220: Calling all_inventory to load vars for managed_node3 30575 1726867634.84222: Calling groups_inventory to load vars for managed_node3 30575 1726867634.84225: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867634.84235: Calling all_plugins_play to load vars for managed_node3 30575 1726867634.84237: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867634.84239: Calling groups_plugins_play to load vars for managed_node3 30575 1726867634.84791: done sending task result for task 0affcac9-a3a5-e081-a588-00000000166c 30575 1726867634.84794: WORKER PROCESS EXITING 30575 1726867634.85070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867634.86373: done with get_vars() 30575 1726867634.86389: done getting variables 30575 1726867634.86441: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867634.86563: variable 'profile' from source: play vars 30575 1726867634.86567: variable 'interface' from source: play vars 30575 1726867634.86623: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:27:14 -0400 (0:00:00.047) 0:01:10.244 ****** 30575 1726867634.86654: entering _queue_task() for managed_node3/command 30575 1726867634.86964: worker is 1 (out of 1 available) 30575 1726867634.86980: exiting _queue_task() for managed_node3/command 30575 
1726867634.86997: done queuing things up, now waiting for results queue to drain 30575 1726867634.86999: waiting for pending results... 30575 1726867634.87415: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr 30575 1726867634.87462: in run() - task 0affcac9-a3a5-e081-a588-00000000166d 30575 1726867634.87467: variable 'ansible_search_path' from source: unknown 30575 1726867634.87470: variable 'ansible_search_path' from source: unknown 30575 1726867634.87506: calling self._execute() 30575 1726867634.87575: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867634.87581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867634.87592: variable 'omit' from source: magic vars 30575 1726867634.87859: variable 'ansible_distribution_major_version' from source: facts 30575 1726867634.87865: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867634.87953: variable 'profile_stat' from source: set_fact 30575 1726867634.87962: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867634.87966: when evaluation is False, skipping this task 30575 1726867634.87969: _execute() done 30575 1726867634.87971: dumping result to json 30575 1726867634.87974: done dumping result, returning 30575 1726867634.87981: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-00000000166d] 30575 1726867634.87987: sending task result for task 0affcac9-a3a5-e081-a588-00000000166d 30575 1726867634.88065: done sending task result for task 0affcac9-a3a5-e081-a588-00000000166d 30575 1726867634.88068: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867634.88129: no more pending results, returning what we have 30575 1726867634.88132: results queue empty 30575 
1726867634.88133: checking for any_errors_fatal 30575 1726867634.88138: done checking for any_errors_fatal 30575 1726867634.88139: checking for max_fail_percentage 30575 1726867634.88141: done checking for max_fail_percentage 30575 1726867634.88141: checking to see if all hosts have failed and the running result is not ok 30575 1726867634.88142: done checking to see if all hosts have failed 30575 1726867634.88143: getting the remaining hosts for this loop 30575 1726867634.88144: done getting the remaining hosts for this loop 30575 1726867634.88147: getting the next task for host managed_node3 30575 1726867634.88153: done getting next task for host managed_node3 30575 1726867634.88155: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30575 1726867634.88160: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867634.88163: getting variables 30575 1726867634.88164: in VariableManager get_vars() 30575 1726867634.88199: Calling all_inventory to load vars for managed_node3 30575 1726867634.88202: Calling groups_inventory to load vars for managed_node3 30575 1726867634.88205: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867634.88215: Calling all_plugins_play to load vars for managed_node3 30575 1726867634.88217: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867634.88220: Calling groups_plugins_play to load vars for managed_node3 30575 1726867634.88964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867634.90250: done with get_vars() 30575 1726867634.90264: done getting variables 30575 1726867634.90306: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867634.90378: variable 'profile' from source: play vars 30575 1726867634.90382: variable 'interface' from source: play vars 30575 1726867634.90420: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:27:14 -0400 (0:00:00.037) 0:01:10.281 ****** 30575 1726867634.90443: entering _queue_task() for managed_node3/set_fact 30575 1726867634.90643: worker is 1 (out of 1 available) 30575 1726867634.90656: exiting _queue_task() for managed_node3/set_fact 30575 1726867634.90668: done queuing things up, now waiting for results queue to drain 30575 1726867634.90670: waiting for pending results... 
30575 1726867634.90845: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr 30575 1726867634.90938: in run() - task 0affcac9-a3a5-e081-a588-00000000166e 30575 1726867634.90950: variable 'ansible_search_path' from source: unknown 30575 1726867634.90953: variable 'ansible_search_path' from source: unknown 30575 1726867634.90987: calling self._execute() 30575 1726867634.91059: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867634.91063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867634.91071: variable 'omit' from source: magic vars 30575 1726867634.91350: variable 'ansible_distribution_major_version' from source: facts 30575 1726867634.91359: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867634.91443: variable 'profile_stat' from source: set_fact 30575 1726867634.91452: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867634.91455: when evaluation is False, skipping this task 30575 1726867634.91458: _execute() done 30575 1726867634.91461: dumping result to json 30575 1726867634.91464: done dumping result, returning 30575 1726867634.91471: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-00000000166e] 30575 1726867634.91476: sending task result for task 0affcac9-a3a5-e081-a588-00000000166e 30575 1726867634.91555: done sending task result for task 0affcac9-a3a5-e081-a588-00000000166e 30575 1726867634.91559: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867634.91608: no more pending results, returning what we have 30575 1726867634.91612: results queue empty 30575 1726867634.91612: checking for any_errors_fatal 30575 1726867634.91621: done checking for any_errors_fatal 30575 1726867634.91622: checking 
for max_fail_percentage 30575 1726867634.91623: done checking for max_fail_percentage 30575 1726867634.91624: checking to see if all hosts have failed and the running result is not ok 30575 1726867634.91625: done checking to see if all hosts have failed 30575 1726867634.91626: getting the remaining hosts for this loop 30575 1726867634.91627: done getting the remaining hosts for this loop 30575 1726867634.91630: getting the next task for host managed_node3 30575 1726867634.91639: done getting next task for host managed_node3 30575 1726867634.91641: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 30575 1726867634.91645: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867634.91649: getting variables 30575 1726867634.91650: in VariableManager get_vars() 30575 1726867634.91684: Calling all_inventory to load vars for managed_node3 30575 1726867634.91687: Calling groups_inventory to load vars for managed_node3 30575 1726867634.91690: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867634.91700: Calling all_plugins_play to load vars for managed_node3 30575 1726867634.91702: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867634.91705: Calling groups_plugins_play to load vars for managed_node3 30575 1726867634.92587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867634.93430: done with get_vars() 30575 1726867634.93444: done getting variables 30575 1726867634.93487: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867634.93558: variable 'profile' from source: play vars 30575 1726867634.93561: variable 'interface' from source: play vars 30575 1726867634.93601: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 17:27:14 -0400 (0:00:00.031) 0:01:10.313 ****** 30575 1726867634.93624: entering _queue_task() for managed_node3/assert 30575 1726867634.93807: worker is 1 (out of 1 available) 30575 1726867634.93821: exiting _queue_task() for managed_node3/assert 30575 1726867634.93832: done queuing things up, now waiting for results queue to drain 30575 1726867634.93835: waiting for pending results... 
30575 1726867634.94006: running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'statebr' 30575 1726867634.94091: in run() - task 0affcac9-a3a5-e081-a588-0000000015d5 30575 1726867634.94103: variable 'ansible_search_path' from source: unknown 30575 1726867634.94106: variable 'ansible_search_path' from source: unknown 30575 1726867634.94134: calling self._execute() 30575 1726867634.94205: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867634.94209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867634.94218: variable 'omit' from source: magic vars 30575 1726867634.94483: variable 'ansible_distribution_major_version' from source: facts 30575 1726867634.94494: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867634.94501: variable 'omit' from source: magic vars 30575 1726867634.94536: variable 'omit' from source: magic vars 30575 1726867634.94601: variable 'profile' from source: play vars 30575 1726867634.94606: variable 'interface' from source: play vars 30575 1726867634.94654: variable 'interface' from source: play vars 30575 1726867634.94667: variable 'omit' from source: magic vars 30575 1726867634.94699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867634.94730: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867634.94745: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867634.94758: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867634.94768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867634.94792: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30575 1726867634.94795: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867634.94798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867634.94869: Set connection var ansible_pipelining to False 30575 1726867634.94872: Set connection var ansible_shell_type to sh 30575 1726867634.94879: Set connection var ansible_shell_executable to /bin/sh 30575 1726867634.94884: Set connection var ansible_timeout to 10 30575 1726867634.94889: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867634.94895: Set connection var ansible_connection to ssh 30575 1726867634.94915: variable 'ansible_shell_executable' from source: unknown 30575 1726867634.94918: variable 'ansible_connection' from source: unknown 30575 1726867634.94920: variable 'ansible_module_compression' from source: unknown 30575 1726867634.94924: variable 'ansible_shell_type' from source: unknown 30575 1726867634.94926: variable 'ansible_shell_executable' from source: unknown 30575 1726867634.94928: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867634.94930: variable 'ansible_pipelining' from source: unknown 30575 1726867634.94933: variable 'ansible_timeout' from source: unknown 30575 1726867634.94935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867634.95033: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867634.95041: variable 'omit' from source: magic vars 30575 1726867634.95052: starting attempt loop 30575 1726867634.95055: running the handler 30575 1726867634.95132: variable 'lsr_net_profile_exists' from source: set_fact 30575 1726867634.95136: Evaluated conditional (not 
lsr_net_profile_exists): True 30575 1726867634.95141: handler run complete 30575 1726867634.95155: attempt loop complete, returning result 30575 1726867634.95158: _execute() done 30575 1726867634.95160: dumping result to json 30575 1726867634.95163: done dumping result, returning 30575 1726867634.95174: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'statebr' [0affcac9-a3a5-e081-a588-0000000015d5] 30575 1726867634.95178: sending task result for task 0affcac9-a3a5-e081-a588-0000000015d5 30575 1726867634.95252: done sending task result for task 0affcac9-a3a5-e081-a588-0000000015d5 30575 1726867634.95254: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867634.95324: no more pending results, returning what we have 30575 1726867634.95327: results queue empty 30575 1726867634.95328: checking for any_errors_fatal 30575 1726867634.95332: done checking for any_errors_fatal 30575 1726867634.95333: checking for max_fail_percentage 30575 1726867634.95334: done checking for max_fail_percentage 30575 1726867634.95335: checking to see if all hosts have failed and the running result is not ok 30575 1726867634.95336: done checking to see if all hosts have failed 30575 1726867634.95337: getting the remaining hosts for this loop 30575 1726867634.95338: done getting the remaining hosts for this loop 30575 1726867634.95341: getting the next task for host managed_node3 30575 1726867634.95349: done getting next task for host managed_node3 30575 1726867634.95351: ^ task is: TASK: Conditional asserts 30575 1726867634.95354: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867634.95357: getting variables 30575 1726867634.95359: in VariableManager get_vars() 30575 1726867634.95394: Calling all_inventory to load vars for managed_node3 30575 1726867634.95396: Calling groups_inventory to load vars for managed_node3 30575 1726867634.95399: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867634.95407: Calling all_plugins_play to load vars for managed_node3 30575 1726867634.95410: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867634.95414: Calling groups_plugins_play to load vars for managed_node3 30575 1726867634.96150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867634.97012: done with get_vars() 30575 1726867634.97026: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 17:27:14 -0400 (0:00:00.034) 0:01:10.348 ****** 30575 1726867634.97091: entering _queue_task() for managed_node3/include_tasks 30575 1726867634.97278: worker is 1 (out of 1 available) 30575 1726867634.97291: exiting _queue_task() for managed_node3/include_tasks 30575 1726867634.97303: done queuing things up, now waiting for results queue to drain 30575 1726867634.97305: waiting for pending results... 
30575 1726867634.97469: running TaskExecutor() for managed_node3/TASK: Conditional asserts 30575 1726867634.97542: in run() - task 0affcac9-a3a5-e081-a588-00000000100b 30575 1726867634.97556: variable 'ansible_search_path' from source: unknown 30575 1726867634.97560: variable 'ansible_search_path' from source: unknown 30575 1726867634.97750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867634.99402: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867634.99446: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867634.99471: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867634.99502: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867634.99522: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867634.99580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867634.99603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867634.99624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867634.99649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30575 1726867634.99660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867634.99767: dumping result to json 30575 1726867634.99771: done dumping result, returning 30575 1726867634.99774: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [0affcac9-a3a5-e081-a588-00000000100b] 30575 1726867634.99785: sending task result for task 0affcac9-a3a5-e081-a588-00000000100b 30575 1726867634.99874: done sending task result for task 0affcac9-a3a5-e081-a588-00000000100b 30575 1726867634.99879: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "skipped_reason": "No items in the list" } 30575 1726867634.99924: no more pending results, returning what we have 30575 1726867634.99927: results queue empty 30575 1726867634.99928: checking for any_errors_fatal 30575 1726867634.99935: done checking for any_errors_fatal 30575 1726867634.99935: checking for max_fail_percentage 30575 1726867634.99937: done checking for max_fail_percentage 30575 1726867634.99938: checking to see if all hosts have failed and the running result is not ok 30575 1726867634.99939: done checking to see if all hosts have failed 30575 1726867634.99939: getting the remaining hosts for this loop 30575 1726867634.99941: done getting the remaining hosts for this loop 30575 1726867634.99944: getting the next task for host managed_node3 30575 1726867634.99951: done getting next task for host managed_node3 30575 1726867634.99953: ^ task is: TASK: Success in test '{{ lsr_description }}' 30575 1726867634.99956: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867634.99959: getting variables 30575 1726867634.99960: in VariableManager get_vars() 30575 1726867634.99991: Calling all_inventory to load vars for managed_node3 30575 1726867634.99993: Calling groups_inventory to load vars for managed_node3 30575 1726867634.99996: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867635.00004: Calling all_plugins_play to load vars for managed_node3 30575 1726867635.00007: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867635.00009: Calling groups_plugins_play to load vars for managed_node3 30575 1726867635.00861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867635.01720: done with get_vars() 30575 1726867635.01734: done getting variables 30575 1726867635.01772: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867635.01851: variable 'lsr_description' from source: include params TASK [Success in test 'I can remove an existing profile without taking it down'] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 17:27:15 -0400 (0:00:00.047) 0:01:10.396 ****** 30575 1726867635.01872: entering _queue_task() for managed_node3/debug 30575 1726867635.02074: worker is 1 
(out of 1 available) 30575 1726867635.02089: exiting _queue_task() for managed_node3/debug 30575 1726867635.02102: done queuing things up, now waiting for results queue to drain 30575 1726867635.02103: waiting for pending results... 30575 1726867635.02279: running TaskExecutor() for managed_node3/TASK: Success in test 'I can remove an existing profile without taking it down' 30575 1726867635.02352: in run() - task 0affcac9-a3a5-e081-a588-00000000100c 30575 1726867635.02365: variable 'ansible_search_path' from source: unknown 30575 1726867635.02370: variable 'ansible_search_path' from source: unknown 30575 1726867635.02402: calling self._execute() 30575 1726867635.02469: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.02473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.02486: variable 'omit' from source: magic vars 30575 1726867635.02755: variable 'ansible_distribution_major_version' from source: facts 30575 1726867635.02764: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867635.02770: variable 'omit' from source: magic vars 30575 1726867635.02801: variable 'omit' from source: magic vars 30575 1726867635.02869: variable 'lsr_description' from source: include params 30575 1726867635.02887: variable 'omit' from source: magic vars 30575 1726867635.02920: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867635.02949: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867635.02964: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867635.02979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.02993: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.03021: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867635.03024: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.03026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.03095: Set connection var ansible_pipelining to False 30575 1726867635.03098: Set connection var ansible_shell_type to sh 30575 1726867635.03101: Set connection var ansible_shell_executable to /bin/sh 30575 1726867635.03107: Set connection var ansible_timeout to 10 30575 1726867635.03112: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867635.03120: Set connection var ansible_connection to ssh 30575 1726867635.03139: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.03142: variable 'ansible_connection' from source: unknown 30575 1726867635.03144: variable 'ansible_module_compression' from source: unknown 30575 1726867635.03146: variable 'ansible_shell_type' from source: unknown 30575 1726867635.03149: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.03151: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.03155: variable 'ansible_pipelining' from source: unknown 30575 1726867635.03157: variable 'ansible_timeout' from source: unknown 30575 1726867635.03161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.03263: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867635.03272: variable 'omit' from source: magic vars 30575 1726867635.03278: starting attempt 
loop 30575 1726867635.03282: running the handler 30575 1726867635.03320: handler run complete 30575 1726867635.03332: attempt loop complete, returning result 30575 1726867635.03335: _execute() done 30575 1726867635.03337: dumping result to json 30575 1726867635.03339: done dumping result, returning 30575 1726867635.03347: done running TaskExecutor() for managed_node3/TASK: Success in test 'I can remove an existing profile without taking it down' [0affcac9-a3a5-e081-a588-00000000100c] 30575 1726867635.03352: sending task result for task 0affcac9-a3a5-e081-a588-00000000100c 30575 1726867635.03429: done sending task result for task 0affcac9-a3a5-e081-a588-00000000100c 30575 1726867635.03433: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: +++++ Success in test 'I can remove an existing profile without taking it down' +++++ 30575 1726867635.03474: no more pending results, returning what we have 30575 1726867635.03479: results queue empty 30575 1726867635.03480: checking for any_errors_fatal 30575 1726867635.03488: done checking for any_errors_fatal 30575 1726867635.03489: checking for max_fail_percentage 30575 1726867635.03490: done checking for max_fail_percentage 30575 1726867635.03491: checking to see if all hosts have failed and the running result is not ok 30575 1726867635.03492: done checking to see if all hosts have failed 30575 1726867635.03493: getting the remaining hosts for this loop 30575 1726867635.03494: done getting the remaining hosts for this loop 30575 1726867635.03498: getting the next task for host managed_node3 30575 1726867635.03506: done getting next task for host managed_node3 30575 1726867635.03508: ^ task is: TASK: Cleanup 30575 1726867635.03511: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867635.03516: getting variables 30575 1726867635.03517: in VariableManager get_vars() 30575 1726867635.03548: Calling all_inventory to load vars for managed_node3 30575 1726867635.03550: Calling groups_inventory to load vars for managed_node3 30575 1726867635.03553: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867635.03561: Calling all_plugins_play to load vars for managed_node3 30575 1726867635.03564: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867635.03566: Calling groups_plugins_play to load vars for managed_node3 30575 1726867635.04318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867635.05267: done with get_vars() 30575 1726867635.05283: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 17:27:15 -0400 (0:00:00.034) 0:01:10.430 ****** 30575 1726867635.05343: entering _queue_task() for managed_node3/include_tasks 30575 1726867635.05528: worker is 1 (out of 1 available) 30575 1726867635.05543: exiting _queue_task() for managed_node3/include_tasks 30575 1726867635.05557: done queuing things up, now waiting for results queue to drain 30575 1726867635.05558: waiting for pending results... 
30575 1726867635.05724: running TaskExecutor() for managed_node3/TASK: Cleanup 30575 1726867635.05792: in run() - task 0affcac9-a3a5-e081-a588-000000001010 30575 1726867635.05803: variable 'ansible_search_path' from source: unknown 30575 1726867635.05806: variable 'ansible_search_path' from source: unknown 30575 1726867635.05842: variable 'lsr_cleanup' from source: include params 30575 1726867635.05981: variable 'lsr_cleanup' from source: include params 30575 1726867635.06036: variable 'omit' from source: magic vars 30575 1726867635.06131: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.06139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.06148: variable 'omit' from source: magic vars 30575 1726867635.06308: variable 'ansible_distribution_major_version' from source: facts 30575 1726867635.06318: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867635.06329: variable 'item' from source: unknown 30575 1726867635.06369: variable 'item' from source: unknown 30575 1726867635.06394: variable 'item' from source: unknown 30575 1726867635.06440: variable 'item' from source: unknown 30575 1726867635.06559: dumping result to json 30575 1726867635.06561: done dumping result, returning 30575 1726867635.06563: done running TaskExecutor() for managed_node3/TASK: Cleanup [0affcac9-a3a5-e081-a588-000000001010] 30575 1726867635.06565: sending task result for task 0affcac9-a3a5-e081-a588-000000001010 30575 1726867635.06603: done sending task result for task 0affcac9-a3a5-e081-a588-000000001010 30575 1726867635.06606: WORKER PROCESS EXITING 30575 1726867635.06626: no more pending results, returning what we have 30575 1726867635.06631: in VariableManager get_vars() 30575 1726867635.06662: Calling all_inventory to load vars for managed_node3 30575 1726867635.06664: Calling groups_inventory to load vars for managed_node3 30575 1726867635.06667: Calling 
all_plugins_inventory to load vars for managed_node3 30575 1726867635.06675: Calling all_plugins_play to load vars for managed_node3 30575 1726867635.06679: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867635.06682: Calling groups_plugins_play to load vars for managed_node3 30575 1726867635.07409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867635.08255: done with get_vars() 30575 1726867635.08268: variable 'ansible_search_path' from source: unknown 30575 1726867635.08269: variable 'ansible_search_path' from source: unknown 30575 1726867635.08295: we have included files to process 30575 1726867635.08296: generating all_blocks data 30575 1726867635.08297: done generating all_blocks data 30575 1726867635.08301: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867635.08302: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867635.08303: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867635.08425: done processing included file 30575 1726867635.08427: iterating over new_blocks loaded from include file 30575 1726867635.08428: in VariableManager get_vars() 30575 1726867635.08437: done with get_vars() 30575 1726867635.08438: filtering new block on tags 30575 1726867635.08456: done filtering new block on tags 30575 1726867635.08458: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node3 => (item=tasks/cleanup_profile+device.yml) 30575 1726867635.08460: extending task lists for all hosts with included blocks 
30575 1726867635.09149: done extending task lists 30575 1726867635.09150: done processing included files 30575 1726867635.09151: results queue empty 30575 1726867635.09151: checking for any_errors_fatal 30575 1726867635.09153: done checking for any_errors_fatal 30575 1726867635.09154: checking for max_fail_percentage 30575 1726867635.09155: done checking for max_fail_percentage 30575 1726867635.09155: checking to see if all hosts have failed and the running result is not ok 30575 1726867635.09156: done checking to see if all hosts have failed 30575 1726867635.09156: getting the remaining hosts for this loop 30575 1726867635.09157: done getting the remaining hosts for this loop 30575 1726867635.09159: getting the next task for host managed_node3 30575 1726867635.09161: done getting next task for host managed_node3 30575 1726867635.09163: ^ task is: TASK: Cleanup profile and device 30575 1726867635.09165: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867635.09166: getting variables 30575 1726867635.09167: in VariableManager get_vars() 30575 1726867635.09175: Calling all_inventory to load vars for managed_node3 30575 1726867635.09176: Calling groups_inventory to load vars for managed_node3 30575 1726867635.09179: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867635.09183: Calling all_plugins_play to load vars for managed_node3 30575 1726867635.09184: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867635.09186: Calling groups_plugins_play to load vars for managed_node3 30575 1726867635.13242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867635.14066: done with get_vars() 30575 1726867635.14083: done getting variables 30575 1726867635.14115: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 17:27:15 -0400 (0:00:00.087) 0:01:10.518 ****** 30575 1726867635.14133: entering _queue_task() for managed_node3/shell 30575 1726867635.14399: worker is 1 (out of 1 available) 30575 1726867635.14413: exiting _queue_task() for managed_node3/shell 30575 1726867635.14427: done queuing things up, now waiting for results queue to drain 30575 1726867635.14430: waiting for pending results... 
30575 1726867635.14613: running TaskExecutor() for managed_node3/TASK: Cleanup profile and device 30575 1726867635.14703: in run() - task 0affcac9-a3a5-e081-a588-0000000016ad 30575 1726867635.14717: variable 'ansible_search_path' from source: unknown 30575 1726867635.14723: variable 'ansible_search_path' from source: unknown 30575 1726867635.14750: calling self._execute() 30575 1726867635.14827: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.14831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.14838: variable 'omit' from source: magic vars 30575 1726867635.15124: variable 'ansible_distribution_major_version' from source: facts 30575 1726867635.15132: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867635.15138: variable 'omit' from source: magic vars 30575 1726867635.15171: variable 'omit' from source: magic vars 30575 1726867635.15275: variable 'interface' from source: play vars 30575 1726867635.15291: variable 'omit' from source: magic vars 30575 1726867635.15329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867635.15355: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867635.15371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867635.15386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.15396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.15425: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867635.15428: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.15431: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.15497: Set connection var ansible_pipelining to False 30575 1726867635.15500: Set connection var ansible_shell_type to sh 30575 1726867635.15506: Set connection var ansible_shell_executable to /bin/sh 30575 1726867635.15511: Set connection var ansible_timeout to 10 30575 1726867635.15519: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867635.15532: Set connection var ansible_connection to ssh 30575 1726867635.15546: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.15550: variable 'ansible_connection' from source: unknown 30575 1726867635.15552: variable 'ansible_module_compression' from source: unknown 30575 1726867635.15554: variable 'ansible_shell_type' from source: unknown 30575 1726867635.15557: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.15559: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.15561: variable 'ansible_pipelining' from source: unknown 30575 1726867635.15564: variable 'ansible_timeout' from source: unknown 30575 1726867635.15567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.15670: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867635.15681: variable 'omit' from source: magic vars 30575 1726867635.15687: starting attempt loop 30575 1726867635.15689: running the handler 30575 1726867635.15698: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867635.15718: _low_level_execute_command(): starting 30575 1726867635.15725: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867635.16503: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867635.16543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867635.18237: stdout chunk (state=3): >>>/root <<< 30575 1726867635.18338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867635.18493: stderr chunk (state=3): >>><<< 30575 1726867635.18497: stdout chunk (state=3): >>><<< 30575 1726867635.18500: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867635.18504: _low_level_execute_command(): starting 30575 1726867635.18507: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867635.183958-33924-21011417758744 `" && echo ansible-tmp-1726867635.183958-33924-21011417758744="` echo /root/.ansible/tmp/ansible-tmp-1726867635.183958-33924-21011417758744 `" ) && sleep 0' 30575 1726867635.19069: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867635.19094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867635.19126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867635.19201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867635.21093: stdout chunk (state=3): >>>ansible-tmp-1726867635.183958-33924-21011417758744=/root/.ansible/tmp/ansible-tmp-1726867635.183958-33924-21011417758744 <<< 30575 1726867635.21200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867635.21223: stderr chunk (state=3): >>><<< 30575 1726867635.21226: stdout chunk (state=3): >>><<< 30575 1726867635.21240: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867635.183958-33924-21011417758744=/root/.ansible/tmp/ansible-tmp-1726867635.183958-33924-21011417758744 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867635.21266: variable 'ansible_module_compression' from source: unknown 30575 1726867635.21307: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867635.21340: variable 'ansible_facts' from source: unknown 30575 1726867635.21400: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867635.183958-33924-21011417758744/AnsiballZ_command.py 30575 1726867635.21495: Sending initial data 30575 1726867635.21499: Sent initial data (154 bytes) 30575 1726867635.21920: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867635.21925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867635.21938: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867635.21984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867635.22002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867635.22043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867635.23591: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867635.23596: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867635.23637: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867635.23680: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp7mjmysed /root/.ansible/tmp/ansible-tmp-1726867635.183958-33924-21011417758744/AnsiballZ_command.py <<< 30575 1726867635.23688: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867635.183958-33924-21011417758744/AnsiballZ_command.py" <<< 30575 1726867635.23724: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp7mjmysed" to remote "/root/.ansible/tmp/ansible-tmp-1726867635.183958-33924-21011417758744/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867635.183958-33924-21011417758744/AnsiballZ_command.py" <<< 30575 1726867635.24257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867635.24298: stderr chunk (state=3): >>><<< 30575 1726867635.24301: stdout chunk (state=3): >>><<< 30575 1726867635.24333: done transferring module to remote 30575 1726867635.24343: _low_level_execute_command(): starting 30575 1726867635.24348: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867635.183958-33924-21011417758744/ /root/.ansible/tmp/ansible-tmp-1726867635.183958-33924-21011417758744/AnsiballZ_command.py && sleep 0' 30575 1726867635.24766: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867635.24799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867635.24803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867635.24805: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867635.24807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867635.24858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867635.24861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867635.24912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867635.26673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867635.26697: stderr chunk (state=3): >>><<< 30575 1726867635.26701: stdout chunk (state=3): >>><<< 30575 1726867635.26715: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867635.26718: _low_level_execute_command(): starting 30575 1726867635.26722: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867635.183958-33924-21011417758744/AnsiballZ_command.py && sleep 0' 30575 1726867635.27156: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867635.27160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867635.27162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867635.27164: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867635.27218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 
30575 1726867635.27225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867635.27227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867635.27274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867635.47628: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (12e4c575-fa21-4cd0-afc7-2cb6b45b6219) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 17:27:15.421855", "end": "2024-09-20 17:27:15.473542", "delta": "0:00:00.051687", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867635.50486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867635.50490: stdout chunk (state=3): >>><<< 30575 1726867635.50493: stderr chunk (state=3): >>><<< 30575 1726867635.50496: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Connection 'statebr' (12e4c575-fa21-4cd0-afc7-2cb6b45b6219) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 17:27:15.421855", "end": "2024-09-20 17:27:15.473542", "delta": "0:00:00.051687", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867635.50498: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867635.183958-33924-21011417758744/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867635.50501: _low_level_execute_command(): starting 30575 1726867635.50504: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867635.183958-33924-21011417758744/ > /dev/null 2>&1 && sleep 0' 30575 1726867635.51001: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867635.51009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867635.51022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867635.51043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
30575 1726867635.51057: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867635.51063: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867635.51073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867635.51154: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867635.51166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867635.51183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867635.51215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867635.51282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867635.53215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867635.53218: stdout chunk (state=3): >>><<< 30575 1726867635.53221: stderr chunk (state=3): >>><<< 30575 1726867635.53224: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867635.53227: handler run complete 30575 1726867635.53457: Evaluated conditional (False): False 30575 1726867635.53460: attempt loop complete, returning result 30575 1726867635.53462: _execute() done 30575 1726867635.53464: dumping result to json 30575 1726867635.53466: done dumping result, returning 30575 1726867635.53468: done running TaskExecutor() for managed_node3/TASK: Cleanup profile and device [0affcac9-a3a5-e081-a588-0000000016ad] 30575 1726867635.53469: sending task result for task 0affcac9-a3a5-e081-a588-0000000016ad 30575 1726867635.53748: done sending task result for task 0affcac9-a3a5-e081-a588-0000000016ad 30575 1726867635.53752: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.051687", "end": "2024-09-20 17:27:15.473542", "rc": 0, "start": "2024-09-20 17:27:15.421855" } STDOUT: Connection 'statebr' (12e4c575-fa21-4cd0-afc7-2cb6b45b6219) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' 30575 1726867635.53823: no more pending results, returning what we have 30575 1726867635.53827: results queue empty 30575 1726867635.53828: checking for any_errors_fatal 30575 1726867635.53829: done checking for any_errors_fatal 30575 1726867635.53830: checking for max_fail_percentage 30575 1726867635.53832: done checking for max_fail_percentage 30575 1726867635.53833: checking to see if all hosts have failed and the running result is not ok 30575 1726867635.53834: done checking to see if all hosts have failed 30575 1726867635.53834: getting the remaining hosts for this loop 30575 1726867635.53836: done getting the remaining hosts for this loop 30575 1726867635.53839: getting the next task for host managed_node3 30575 1726867635.53849: done getting next task for host managed_node3 30575 1726867635.53852: ^ task is: TASK: Include the task 'run_test.yml' 30575 1726867635.53854: ^ state is: HOST STATE: block=7, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867635.53857: getting variables 30575 1726867635.53859: in VariableManager get_vars() 30575 1726867635.53894: Calling all_inventory to load vars for managed_node3 30575 1726867635.53897: Calling groups_inventory to load vars for managed_node3 30575 1726867635.53900: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867635.53910: Calling all_plugins_play to load vars for managed_node3 30575 1726867635.53913: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867635.53916: Calling groups_plugins_play to load vars for managed_node3 30575 1726867635.55123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867635.56008: done with get_vars() 30575 1726867635.56025: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:102 Friday 20 September 2024 17:27:15 -0400 (0:00:00.419) 0:01:10.938 ****** 30575 1726867635.56091: entering _queue_task() for managed_node3/include_tasks 30575 1726867635.56307: worker is 1 (out of 1 available) 30575 1726867635.56320: exiting _queue_task() for managed_node3/include_tasks 30575 1726867635.56333: done queuing things up, now waiting for results queue to drain 30575 1726867635.56336: waiting for pending results... 
30575 1726867635.56540: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' 30575 1726867635.56607: in run() - task 0affcac9-a3a5-e081-a588-000000000015 30575 1726867635.56619: variable 'ansible_search_path' from source: unknown 30575 1726867635.56647: calling self._execute() 30575 1726867635.56722: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.56726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.56735: variable 'omit' from source: magic vars 30575 1726867635.57021: variable 'ansible_distribution_major_version' from source: facts 30575 1726867635.57025: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867635.57029: _execute() done 30575 1726867635.57032: dumping result to json 30575 1726867635.57036: done dumping result, returning 30575 1726867635.57043: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [0affcac9-a3a5-e081-a588-000000000015] 30575 1726867635.57048: sending task result for task 0affcac9-a3a5-e081-a588-000000000015 30575 1726867635.57169: done sending task result for task 0affcac9-a3a5-e081-a588-000000000015 30575 1726867635.57172: WORKER PROCESS EXITING 30575 1726867635.57203: no more pending results, returning what we have 30575 1726867635.57207: in VariableManager get_vars() 30575 1726867635.57245: Calling all_inventory to load vars for managed_node3 30575 1726867635.57247: Calling groups_inventory to load vars for managed_node3 30575 1726867635.57250: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867635.57261: Calling all_plugins_play to load vars for managed_node3 30575 1726867635.57263: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867635.57266: Calling groups_plugins_play to load vars for managed_node3 30575 1726867635.58278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30575 1726867635.59382: done with get_vars() 30575 1726867635.59395: variable 'ansible_search_path' from source: unknown 30575 1726867635.59404: we have included files to process 30575 1726867635.59405: generating all_blocks data 30575 1726867635.59406: done generating all_blocks data 30575 1726867635.59412: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867635.59413: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867635.59414: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867635.59665: in VariableManager get_vars() 30575 1726867635.59680: done with get_vars() 30575 1726867635.59706: in VariableManager get_vars() 30575 1726867635.59720: done with get_vars() 30575 1726867635.59744: in VariableManager get_vars() 30575 1726867635.59754: done with get_vars() 30575 1726867635.59781: in VariableManager get_vars() 30575 1726867635.59792: done with get_vars() 30575 1726867635.59819: in VariableManager get_vars() 30575 1726867635.59829: done with get_vars() 30575 1726867635.60081: in VariableManager get_vars() 30575 1726867635.60092: done with get_vars() 30575 1726867635.60100: done processing included file 30575 1726867635.60101: iterating over new_blocks loaded from include file 30575 1726867635.60102: in VariableManager get_vars() 30575 1726867635.60112: done with get_vars() 30575 1726867635.60114: filtering new block on tags 30575 1726867635.60170: done filtering new block on tags 30575 1726867635.60172: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3 30575 1726867635.60175: extending task lists for all hosts with included 
blocks 30575 1726867635.60197: done extending task lists 30575 1726867635.60198: done processing included files 30575 1726867635.60198: results queue empty 30575 1726867635.60199: checking for any_errors_fatal 30575 1726867635.60202: done checking for any_errors_fatal 30575 1726867635.60202: checking for max_fail_percentage 30575 1726867635.60203: done checking for max_fail_percentage 30575 1726867635.60203: checking to see if all hosts have failed and the running result is not ok 30575 1726867635.60204: done checking to see if all hosts have failed 30575 1726867635.60204: getting the remaining hosts for this loop 30575 1726867635.60205: done getting the remaining hosts for this loop 30575 1726867635.60207: getting the next task for host managed_node3 30575 1726867635.60209: done getting next task for host managed_node3 30575 1726867635.60213: ^ task is: TASK: TEST: {{ lsr_description }} 30575 1726867635.60215: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867635.60216: getting variables 30575 1726867635.60217: in VariableManager get_vars() 30575 1726867635.60224: Calling all_inventory to load vars for managed_node3 30575 1726867635.60225: Calling groups_inventory to load vars for managed_node3 30575 1726867635.60226: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867635.60230: Calling all_plugins_play to load vars for managed_node3 30575 1726867635.60231: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867635.60233: Calling groups_plugins_play to load vars for managed_node3 30575 1726867635.60846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867635.61728: done with get_vars() 30575 1726867635.61742: done getting variables 30575 1726867635.61766: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867635.61847: variable 'lsr_description' from source: include params TASK [TEST: I can take a profile down that is absent] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 17:27:15 -0400 (0:00:00.057) 0:01:10.996 ****** 30575 1726867635.61866: entering _queue_task() for managed_node3/debug 30575 1726867635.62074: worker is 1 (out of 1 available) 30575 1726867635.62088: exiting _queue_task() for managed_node3/debug 30575 1726867635.62100: done queuing things up, now waiting for results queue to drain 30575 1726867635.62102: waiting for pending results... 
30575 1726867635.62276: running TaskExecutor() for managed_node3/TASK: TEST: I can take a profile down that is absent 30575 1726867635.62344: in run() - task 0affcac9-a3a5-e081-a588-000000001744 30575 1726867635.62355: variable 'ansible_search_path' from source: unknown 30575 1726867635.62358: variable 'ansible_search_path' from source: unknown 30575 1726867635.62390: calling self._execute() 30575 1726867635.62457: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.62463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.62472: variable 'omit' from source: magic vars 30575 1726867635.62740: variable 'ansible_distribution_major_version' from source: facts 30575 1726867635.62749: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867635.62755: variable 'omit' from source: magic vars 30575 1726867635.62781: variable 'omit' from source: magic vars 30575 1726867635.62848: variable 'lsr_description' from source: include params 30575 1726867635.62862: variable 'omit' from source: magic vars 30575 1726867635.62896: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867635.62923: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867635.62940: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867635.62952: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.62963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.62990: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867635.62993: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867635.62996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.63061: Set connection var ansible_pipelining to False 30575 1726867635.63064: Set connection var ansible_shell_type to sh 30575 1726867635.63067: Set connection var ansible_shell_executable to /bin/sh 30575 1726867635.63073: Set connection var ansible_timeout to 10 30575 1726867635.63079: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867635.63091: Set connection var ansible_connection to ssh 30575 1726867635.63104: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.63108: variable 'ansible_connection' from source: unknown 30575 1726867635.63112: variable 'ansible_module_compression' from source: unknown 30575 1726867635.63115: variable 'ansible_shell_type' from source: unknown 30575 1726867635.63117: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.63120: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.63122: variable 'ansible_pipelining' from source: unknown 30575 1726867635.63124: variable 'ansible_timeout' from source: unknown 30575 1726867635.63126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.63229: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867635.63238: variable 'omit' from source: magic vars 30575 1726867635.63243: starting attempt loop 30575 1726867635.63246: running the handler 30575 1726867635.63284: handler run complete 30575 1726867635.63293: attempt loop complete, returning result 30575 1726867635.63296: _execute() done 30575 1726867635.63298: dumping result to json 30575 1726867635.63308: done dumping result, returning 
30575 1726867635.63313: done running TaskExecutor() for managed_node3/TASK: TEST: I can take a profile down that is absent [0affcac9-a3a5-e081-a588-000000001744] 30575 1726867635.63316: sending task result for task 0affcac9-a3a5-e081-a588-000000001744 30575 1726867635.63399: done sending task result for task 0affcac9-a3a5-e081-a588-000000001744 30575 1726867635.63402: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ########## I can take a profile down that is absent ########## 30575 1726867635.63455: no more pending results, returning what we have 30575 1726867635.63459: results queue empty 30575 1726867635.63460: checking for any_errors_fatal 30575 1726867635.63461: done checking for any_errors_fatal 30575 1726867635.63461: checking for max_fail_percentage 30575 1726867635.63463: done checking for max_fail_percentage 30575 1726867635.63463: checking to see if all hosts have failed and the running result is not ok 30575 1726867635.63464: done checking to see if all hosts have failed 30575 1726867635.63465: getting the remaining hosts for this loop 30575 1726867635.63466: done getting the remaining hosts for this loop 30575 1726867635.63469: getting the next task for host managed_node3 30575 1726867635.63476: done getting next task for host managed_node3 30575 1726867635.63481: ^ task is: TASK: Show item 30575 1726867635.63483: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867635.63487: getting variables 30575 1726867635.63489: in VariableManager get_vars() 30575 1726867635.63519: Calling all_inventory to load vars for managed_node3 30575 1726867635.63522: Calling groups_inventory to load vars for managed_node3 30575 1726867635.63524: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867635.63532: Calling all_plugins_play to load vars for managed_node3 30575 1726867635.63535: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867635.63537: Calling groups_plugins_play to load vars for managed_node3 30575 1726867635.64274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867635.65383: done with get_vars() 30575 1726867635.65400: done getting variables 30575 1726867635.65448: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 17:27:15 -0400 (0:00:00.036) 0:01:11.032 ****** 30575 1726867635.65472: entering _queue_task() for managed_node3/debug 30575 1726867635.65721: worker is 1 (out of 1 available) 30575 1726867635.65734: exiting _queue_task() for managed_node3/debug 30575 1726867635.65745: done queuing things up, now waiting for results queue to drain 30575 1726867635.65747: waiting for pending results... 
30575 1726867635.66192: running TaskExecutor() for managed_node3/TASK: Show item 30575 1726867635.66196: in run() - task 0affcac9-a3a5-e081-a588-000000001745 30575 1726867635.66198: variable 'ansible_search_path' from source: unknown 30575 1726867635.66201: variable 'ansible_search_path' from source: unknown 30575 1726867635.66203: variable 'omit' from source: magic vars 30575 1726867635.66333: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.66345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.66358: variable 'omit' from source: magic vars 30575 1726867635.66704: variable 'ansible_distribution_major_version' from source: facts 30575 1726867635.66723: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867635.66738: variable 'omit' from source: magic vars 30575 1726867635.66763: variable 'omit' from source: magic vars 30575 1726867635.66793: variable 'item' from source: unknown 30575 1726867635.66844: variable 'item' from source: unknown 30575 1726867635.66860: variable 'omit' from source: magic vars 30575 1726867635.66890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867635.66918: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867635.66934: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867635.66951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.66961: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.66987: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867635.66991: variable 'ansible_host' from source: host vars for 'managed_node3' 
30575 1726867635.66993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.67061: Set connection var ansible_pipelining to False 30575 1726867635.67064: Set connection var ansible_shell_type to sh 30575 1726867635.67069: Set connection var ansible_shell_executable to /bin/sh 30575 1726867635.67074: Set connection var ansible_timeout to 10 30575 1726867635.67081: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867635.67088: Set connection var ansible_connection to ssh 30575 1726867635.67103: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.67106: variable 'ansible_connection' from source: unknown 30575 1726867635.67108: variable 'ansible_module_compression' from source: unknown 30575 1726867635.67113: variable 'ansible_shell_type' from source: unknown 30575 1726867635.67115: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.67118: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.67120: variable 'ansible_pipelining' from source: unknown 30575 1726867635.67122: variable 'ansible_timeout' from source: unknown 30575 1726867635.67124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.67220: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867635.67229: variable 'omit' from source: magic vars 30575 1726867635.67234: starting attempt loop 30575 1726867635.67237: running the handler 30575 1726867635.67272: variable 'lsr_description' from source: include params 30575 1726867635.67320: variable 'lsr_description' from source: include params 30575 1726867635.67328: handler run complete 30575 1726867635.67341: attempt loop 
complete, returning result 30575 1726867635.67353: variable 'item' from source: unknown 30575 1726867635.67401: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can take a profile down that is absent" } 30575 1726867635.67544: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.67547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.67549: variable 'omit' from source: magic vars 30575 1726867635.67668: variable 'ansible_distribution_major_version' from source: facts 30575 1726867635.67672: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867635.67674: variable 'omit' from source: magic vars 30575 1726867635.67676: variable 'omit' from source: magic vars 30575 1726867635.67680: variable 'item' from source: unknown 30575 1726867635.67702: variable 'item' from source: unknown 30575 1726867635.67715: variable 'omit' from source: magic vars 30575 1726867635.67728: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867635.67735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.67741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.67750: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867635.67753: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.67755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.67803: Set connection var ansible_pipelining to False 30575 1726867635.67806: Set connection var ansible_shell_type to sh 30575 
1726867635.67808: Set connection var ansible_shell_executable to /bin/sh 30575 1726867635.67886: Set connection var ansible_timeout to 10 30575 1726867635.67889: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867635.67892: Set connection var ansible_connection to ssh 30575 1726867635.67894: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.67896: variable 'ansible_connection' from source: unknown 30575 1726867635.67899: variable 'ansible_module_compression' from source: unknown 30575 1726867635.67901: variable 'ansible_shell_type' from source: unknown 30575 1726867635.67903: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.67905: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.67907: variable 'ansible_pipelining' from source: unknown 30575 1726867635.67909: variable 'ansible_timeout' from source: unknown 30575 1726867635.67914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.67916: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867635.67919: variable 'omit' from source: magic vars 30575 1726867635.67921: starting attempt loop 30575 1726867635.67923: running the handler 30575 1726867635.67936: variable 'lsr_setup' from source: include params 30575 1726867635.67987: variable 'lsr_setup' from source: include params 30575 1726867635.68023: handler run complete 30575 1726867635.68032: attempt loop complete, returning result 30575 1726867635.68043: variable 'item' from source: unknown 30575 1726867635.68087: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/create_bridge_profile.yml", "tasks/activate_profile.yml", "tasks/remove_profile.yml" ] } 30575 1726867635.68169: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.68172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.68179: variable 'omit' from source: magic vars 30575 1726867635.68280: variable 'ansible_distribution_major_version' from source: facts 30575 1726867635.68283: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867635.68288: variable 'omit' from source: magic vars 30575 1726867635.68298: variable 'omit' from source: magic vars 30575 1726867635.68328: variable 'item' from source: unknown 30575 1726867635.68370: variable 'item' from source: unknown 30575 1726867635.68383: variable 'omit' from source: magic vars 30575 1726867635.68397: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867635.68403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.68409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.68418: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867635.68422: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.68425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.68468: Set connection var ansible_pipelining to False 30575 1726867635.68471: Set connection var ansible_shell_type to sh 30575 1726867635.68473: Set connection var ansible_shell_executable to /bin/sh 30575 1726867635.68480: Set connection var ansible_timeout to 10 30575 1726867635.68485: Set connection var ansible_module_compression to 
ZIP_DEFLATED 30575 1726867635.68491: Set connection var ansible_connection to ssh 30575 1726867635.68505: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.68508: variable 'ansible_connection' from source: unknown 30575 1726867635.68512: variable 'ansible_module_compression' from source: unknown 30575 1726867635.68515: variable 'ansible_shell_type' from source: unknown 30575 1726867635.68517: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.68519: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.68521: variable 'ansible_pipelining' from source: unknown 30575 1726867635.68523: variable 'ansible_timeout' from source: unknown 30575 1726867635.68526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.68582: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867635.68589: variable 'omit' from source: magic vars 30575 1726867635.68591: starting attempt loop 30575 1726867635.68594: running the handler 30575 1726867635.68609: variable 'lsr_test' from source: include params 30575 1726867635.68653: variable 'lsr_test' from source: include params 30575 1726867635.68666: handler run complete 30575 1726867635.68675: attempt loop complete, returning result 30575 1726867635.68687: variable 'item' from source: unknown 30575 1726867635.68729: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove+down_profile.yml" ] } 30575 1726867635.68801: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.68804: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867635.68807: variable 'omit' from source: magic vars 30575 1726867635.68902: variable 'ansible_distribution_major_version' from source: facts 30575 1726867635.68906: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867635.68912: variable 'omit' from source: magic vars 30575 1726867635.68982: variable 'omit' from source: magic vars 30575 1726867635.68984: variable 'item' from source: unknown 30575 1726867635.68995: variable 'item' from source: unknown 30575 1726867635.69006: variable 'omit' from source: magic vars 30575 1726867635.69020: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867635.69030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.69033: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.69040: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867635.69043: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.69045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.69087: Set connection var ansible_pipelining to False 30575 1726867635.69090: Set connection var ansible_shell_type to sh 30575 1726867635.69092: Set connection var ansible_shell_executable to /bin/sh 30575 1726867635.69098: Set connection var ansible_timeout to 10 30575 1726867635.69103: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867635.69108: Set connection var ansible_connection to ssh 30575 1726867635.69124: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.69126: variable 'ansible_connection' from source: unknown 30575 1726867635.69129: variable 
'ansible_module_compression' from source: unknown 30575 1726867635.69132: variable 'ansible_shell_type' from source: unknown 30575 1726867635.69135: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.69137: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.69139: variable 'ansible_pipelining' from source: unknown 30575 1726867635.69141: variable 'ansible_timeout' from source: unknown 30575 1726867635.69148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.69200: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867635.69206: variable 'omit' from source: magic vars 30575 1726867635.69209: starting attempt loop 30575 1726867635.69214: running the handler 30575 1726867635.69227: variable 'lsr_assert' from source: include params 30575 1726867635.69271: variable 'lsr_assert' from source: include params 30575 1726867635.69284: handler run complete 30575 1726867635.69293: attempt loop complete, returning result 30575 1726867635.69304: variable 'item' from source: unknown 30575 1726867635.69346: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_absent.yml" ] } 30575 1726867635.69420: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.69424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.69426: variable 'omit' from source: magic vars 30575 1726867635.69553: variable 'ansible_distribution_major_version' from source: facts 30575 1726867635.69557: Evaluated conditional (ansible_distribution_major_version != '6'): True 
30575 1726867635.69563: variable 'omit' from source: magic vars 30575 1726867635.69572: variable 'omit' from source: magic vars 30575 1726867635.69600: variable 'item' from source: unknown 30575 1726867635.69643: variable 'item' from source: unknown 30575 1726867635.69656: variable 'omit' from source: magic vars 30575 1726867635.69669: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867635.69675: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.69683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.69691: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867635.69694: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.69696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.69738: Set connection var ansible_pipelining to False 30575 1726867635.69741: Set connection var ansible_shell_type to sh 30575 1726867635.69744: Set connection var ansible_shell_executable to /bin/sh 30575 1726867635.69749: Set connection var ansible_timeout to 10 30575 1726867635.69755: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867635.69763: Set connection var ansible_connection to ssh 30575 1726867635.69776: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.69781: variable 'ansible_connection' from source: unknown 30575 1726867635.69783: variable 'ansible_module_compression' from source: unknown 30575 1726867635.69785: variable 'ansible_shell_type' from source: unknown 30575 1726867635.69787: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.69789: variable 'ansible_host' from source: host vars 
for 'managed_node3' 30575 1726867635.69794: variable 'ansible_pipelining' from source: unknown 30575 1726867635.69796: variable 'ansible_timeout' from source: unknown 30575 1726867635.69800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.69855: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867635.69861: variable 'omit' from source: magic vars 30575 1726867635.69865: starting attempt loop 30575 1726867635.69868: running the handler 30575 1726867635.69884: variable 'lsr_assert_when' from source: include params 30575 1726867635.69927: variable 'lsr_assert_when' from source: include params 30575 1726867635.69986: variable 'network_provider' from source: set_fact 30575 1726867635.70008: handler run complete 30575 1726867635.70020: attempt loop complete, returning result 30575 1726867635.70030: variable 'item' from source: unknown 30575 1726867635.70071: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_absent.yml" } ] } 30575 1726867635.70146: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.70149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.70152: variable 'omit' from source: magic vars 30575 1726867635.70244: variable 'ansible_distribution_major_version' from source: facts 30575 1726867635.70247: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867635.70250: variable 'omit' from source: magic vars 30575 1726867635.70263: variable 'omit' from source: magic vars 30575 1726867635.70290: 
variable 'item' from source: unknown 30575 1726867635.70332: variable 'item' from source: unknown 30575 1726867635.70342: variable 'omit' from source: magic vars 30575 1726867635.70355: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867635.70361: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.70366: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.70375: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867635.70385: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.70388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.70432: Set connection var ansible_pipelining to False 30575 1726867635.70435: Set connection var ansible_shell_type to sh 30575 1726867635.70437: Set connection var ansible_shell_executable to /bin/sh 30575 1726867635.70442: Set connection var ansible_timeout to 10 30575 1726867635.70447: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867635.70453: Set connection var ansible_connection to ssh 30575 1726867635.70469: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.70472: variable 'ansible_connection' from source: unknown 30575 1726867635.70474: variable 'ansible_module_compression' from source: unknown 30575 1726867635.70478: variable 'ansible_shell_type' from source: unknown 30575 1726867635.70481: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.70483: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.70491: variable 'ansible_pipelining' from source: unknown 30575 1726867635.70493: variable 'ansible_timeout' from 
source: unknown 30575 1726867635.70495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.70549: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867635.70555: variable 'omit' from source: magic vars 30575 1726867635.70558: starting attempt loop 30575 1726867635.70560: running the handler 30575 1726867635.70574: variable 'lsr_fail_debug' from source: play vars 30575 1726867635.70623: variable 'lsr_fail_debug' from source: play vars 30575 1726867635.70634: handler run complete 30575 1726867635.70644: attempt loop complete, returning result 30575 1726867635.70654: variable 'item' from source: unknown 30575 1726867635.70696: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30575 1726867635.70762: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.70765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.70775: variable 'omit' from source: magic vars 30575 1726867635.70870: variable 'ansible_distribution_major_version' from source: facts 30575 1726867635.70876: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867635.70880: variable 'omit' from source: magic vars 30575 1726867635.70891: variable 'omit' from source: magic vars 30575 1726867635.70918: variable 'item' from source: unknown 30575 1726867635.70961: variable 'item' from source: unknown 30575 1726867635.70980: variable 'omit' from source: magic vars 30575 1726867635.70994: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867635.71001: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.71005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.71016: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867635.71019: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.71021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.71064: Set connection var ansible_pipelining to False 30575 1726867635.71068: Set connection var ansible_shell_type to sh 30575 1726867635.71070: Set connection var ansible_shell_executable to /bin/sh 30575 1726867635.71075: Set connection var ansible_timeout to 10 30575 1726867635.71082: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867635.71088: Set connection var ansible_connection to ssh 30575 1726867635.71103: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.71106: variable 'ansible_connection' from source: unknown 30575 1726867635.71108: variable 'ansible_module_compression' from source: unknown 30575 1726867635.71111: variable 'ansible_shell_type' from source: unknown 30575 1726867635.71116: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.71118: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.71122: variable 'ansible_pipelining' from source: unknown 30575 1726867635.71124: variable 'ansible_timeout' from source: unknown 30575 1726867635.71128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.71183: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867635.71189: variable 'omit' from source: magic vars 30575 1726867635.71192: starting attempt loop 30575 1726867635.71194: running the handler 30575 1726867635.71210: variable 'lsr_cleanup' from source: include params 30575 1726867635.71254: variable 'lsr_cleanup' from source: include params 30575 1726867635.71265: handler run complete 30575 1726867635.71275: attempt loop complete, returning result 30575 1726867635.71287: variable 'item' from source: unknown 30575 1726867635.71331: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30575 1726867635.71401: dumping result to json 30575 1726867635.71405: done dumping result, returning 30575 1726867635.71407: done running TaskExecutor() for managed_node3/TASK: Show item [0affcac9-a3a5-e081-a588-000000001745] 30575 1726867635.71410: sending task result for task 0affcac9-a3a5-e081-a588-000000001745 30575 1726867635.71451: done sending task result for task 0affcac9-a3a5-e081-a588-000000001745 30575 1726867635.71453: WORKER PROCESS EXITING 30575 1726867635.71503: no more pending results, returning what we have 30575 1726867635.71507: results queue empty 30575 1726867635.71507: checking for any_errors_fatal 30575 1726867635.71515: done checking for any_errors_fatal 30575 1726867635.71516: checking for max_fail_percentage 30575 1726867635.71517: done checking for max_fail_percentage 30575 1726867635.71518: checking to see if all hosts have failed and the running result is not ok 30575 1726867635.71519: done checking to see if all hosts have failed 30575 1726867635.71520: getting the remaining hosts for this loop 
30575 1726867635.71521: done getting the remaining hosts for this loop 30575 1726867635.71525: getting the next task for host managed_node3 30575 1726867635.71532: done getting next task for host managed_node3 30575 1726867635.71534: ^ task is: TASK: Include the task 'show_interfaces.yml' 30575 1726867635.71537: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867635.71541: getting variables 30575 1726867635.71542: in VariableManager get_vars() 30575 1726867635.71581: Calling all_inventory to load vars for managed_node3 30575 1726867635.71584: Calling groups_inventory to load vars for managed_node3 30575 1726867635.71587: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867635.71597: Calling all_plugins_play to load vars for managed_node3 30575 1726867635.71599: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867635.71602: Calling groups_plugins_play to load vars for managed_node3 30575 1726867635.72504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867635.73338: done with get_vars() 30575 1726867635.73352: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 17:27:15 -0400 (0:00:00.079) 0:01:11.111 ****** 30575 
1726867635.73414: entering _queue_task() for managed_node3/include_tasks 30575 1726867635.73629: worker is 1 (out of 1 available) 30575 1726867635.73643: exiting _queue_task() for managed_node3/include_tasks 30575 1726867635.73657: done queuing things up, now waiting for results queue to drain 30575 1726867635.73659: waiting for pending results... 30575 1726867635.73830: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 30575 1726867635.73918: in run() - task 0affcac9-a3a5-e081-a588-000000001746 30575 1726867635.73930: variable 'ansible_search_path' from source: unknown 30575 1726867635.73934: variable 'ansible_search_path' from source: unknown 30575 1726867635.73963: calling self._execute() 30575 1726867635.74033: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.74037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.74046: variable 'omit' from source: magic vars 30575 1726867635.74338: variable 'ansible_distribution_major_version' from source: facts 30575 1726867635.74348: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867635.74353: _execute() done 30575 1726867635.74358: dumping result to json 30575 1726867635.74361: done dumping result, returning 30575 1726867635.74367: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affcac9-a3a5-e081-a588-000000001746] 30575 1726867635.74373: sending task result for task 0affcac9-a3a5-e081-a588-000000001746 30575 1726867635.74457: done sending task result for task 0affcac9-a3a5-e081-a588-000000001746 30575 1726867635.74460: WORKER PROCESS EXITING 30575 1726867635.74489: no more pending results, returning what we have 30575 1726867635.74494: in VariableManager get_vars() 30575 1726867635.74536: Calling all_inventory to load vars for managed_node3 30575 1726867635.74538: Calling groups_inventory to load vars for managed_node3 30575 
1726867635.74542: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867635.74554: Calling all_plugins_play to load vars for managed_node3 30575 1726867635.74556: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867635.74559: Calling groups_plugins_play to load vars for managed_node3 30575 1726867635.75343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867635.76213: done with get_vars() 30575 1726867635.76225: variable 'ansible_search_path' from source: unknown 30575 1726867635.76226: variable 'ansible_search_path' from source: unknown 30575 1726867635.76252: we have included files to process 30575 1726867635.76253: generating all_blocks data 30575 1726867635.76254: done generating all_blocks data 30575 1726867635.76258: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30575 1726867635.76258: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30575 1726867635.76260: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30575 1726867635.76331: in VariableManager get_vars() 30575 1726867635.76345: done with get_vars() 30575 1726867635.76421: done processing included file 30575 1726867635.76422: iterating over new_blocks loaded from include file 30575 1726867635.76423: in VariableManager get_vars() 30575 1726867635.76433: done with get_vars() 30575 1726867635.76434: filtering new block on tags 30575 1726867635.76456: done filtering new block on tags 30575 1726867635.76457: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 30575 1726867635.76461: 
extending task lists for all hosts with included blocks 30575 1726867635.76713: done extending task lists 30575 1726867635.76714: done processing included files 30575 1726867635.76714: results queue empty 30575 1726867635.76715: checking for any_errors_fatal 30575 1726867635.76719: done checking for any_errors_fatal 30575 1726867635.76720: checking for max_fail_percentage 30575 1726867635.76720: done checking for max_fail_percentage 30575 1726867635.76721: checking to see if all hosts have failed and the running result is not ok 30575 1726867635.76722: done checking to see if all hosts have failed 30575 1726867635.76722: getting the remaining hosts for this loop 30575 1726867635.76723: done getting the remaining hosts for this loop 30575 1726867635.76724: getting the next task for host managed_node3 30575 1726867635.76728: done getting next task for host managed_node3 30575 1726867635.76730: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30575 1726867635.76732: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867635.76734: getting variables 30575 1726867635.76734: in VariableManager get_vars() 30575 1726867635.76741: Calling all_inventory to load vars for managed_node3 30575 1726867635.76743: Calling groups_inventory to load vars for managed_node3 30575 1726867635.76744: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867635.76748: Calling all_plugins_play to load vars for managed_node3 30575 1726867635.76749: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867635.76751: Calling groups_plugins_play to load vars for managed_node3 30575 1726867635.77427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867635.78252: done with get_vars() 30575 1726867635.78266: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 17:27:15 -0400 (0:00:00.049) 0:01:11.160 ****** 30575 1726867635.78320: entering _queue_task() for managed_node3/include_tasks 30575 1726867635.78576: worker is 1 (out of 1 available) 30575 1726867635.78591: exiting _queue_task() for managed_node3/include_tasks 30575 1726867635.78605: done queuing things up, now waiting for results queue to drain 30575 1726867635.78607: waiting for pending results... 
30575 1726867635.78793: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 30575 1726867635.78871: in run() - task 0affcac9-a3a5-e081-a588-00000000176d 30575 1726867635.78884: variable 'ansible_search_path' from source: unknown 30575 1726867635.78887: variable 'ansible_search_path' from source: unknown 30575 1726867635.78920: calling self._execute() 30575 1726867635.78991: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.78996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.79004: variable 'omit' from source: magic vars 30575 1726867635.79297: variable 'ansible_distribution_major_version' from source: facts 30575 1726867635.79307: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867635.79316: _execute() done 30575 1726867635.79319: dumping result to json 30575 1726867635.79324: done dumping result, returning 30575 1726867635.79331: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affcac9-a3a5-e081-a588-00000000176d] 30575 1726867635.79337: sending task result for task 0affcac9-a3a5-e081-a588-00000000176d 30575 1726867635.79421: done sending task result for task 0affcac9-a3a5-e081-a588-00000000176d 30575 1726867635.79425: WORKER PROCESS EXITING 30575 1726867635.79453: no more pending results, returning what we have 30575 1726867635.79458: in VariableManager get_vars() 30575 1726867635.79504: Calling all_inventory to load vars for managed_node3 30575 1726867635.79506: Calling groups_inventory to load vars for managed_node3 30575 1726867635.79510: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867635.79524: Calling all_plugins_play to load vars for managed_node3 30575 1726867635.79527: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867635.79529: Calling groups_plugins_play to load vars for managed_node3 30575 
1726867635.80361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867635.81587: done with get_vars() 30575 1726867635.81604: variable 'ansible_search_path' from source: unknown 30575 1726867635.81606: variable 'ansible_search_path' from source: unknown 30575 1726867635.81640: we have included files to process 30575 1726867635.81641: generating all_blocks data 30575 1726867635.81642: done generating all_blocks data 30575 1726867635.81644: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867635.81645: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867635.81647: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867635.81901: done processing included file 30575 1726867635.81902: iterating over new_blocks loaded from include file 30575 1726867635.81903: in VariableManager get_vars() 30575 1726867635.81914: done with get_vars() 30575 1726867635.81916: filtering new block on tags 30575 1726867635.81939: done filtering new block on tags 30575 1726867635.81941: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 30575 1726867635.81944: extending task lists for all hosts with included blocks 30575 1726867635.82036: done extending task lists 30575 1726867635.82037: done processing included files 30575 1726867635.82037: results queue empty 30575 1726867635.82038: checking for any_errors_fatal 30575 1726867635.82039: done checking for any_errors_fatal 30575 1726867635.82040: checking for max_fail_percentage 30575 1726867635.82041: done 
checking for max_fail_percentage 30575 1726867635.82041: checking to see if all hosts have failed and the running result is not ok 30575 1726867635.82042: done checking to see if all hosts have failed 30575 1726867635.82042: getting the remaining hosts for this loop 30575 1726867635.82043: done getting the remaining hosts for this loop 30575 1726867635.82045: getting the next task for host managed_node3 30575 1726867635.82048: done getting next task for host managed_node3 30575 1726867635.82049: ^ task is: TASK: Gather current interface info 30575 1726867635.82052: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867635.82054: getting variables 30575 1726867635.82054: in VariableManager get_vars() 30575 1726867635.82062: Calling all_inventory to load vars for managed_node3 30575 1726867635.82063: Calling groups_inventory to load vars for managed_node3 30575 1726867635.82065: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867635.82068: Calling all_plugins_play to load vars for managed_node3 30575 1726867635.82070: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867635.82071: Calling groups_plugins_play to load vars for managed_node3 30575 1726867635.82730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867635.83569: done with get_vars() 30575 1726867635.83586: done getting variables 30575 1726867635.83621: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 17:27:15 -0400 (0:00:00.053) 0:01:11.214 ****** 30575 1726867635.83648: entering _queue_task() for managed_node3/command 30575 1726867635.83943: worker is 1 (out of 1 available) 30575 1726867635.83957: exiting _queue_task() for managed_node3/command 30575 1726867635.83969: done queuing things up, now waiting for results queue to drain 30575 1726867635.83971: waiting for pending results... 
30575 1726867635.84156: running TaskExecutor() for managed_node3/TASK: Gather current interface info 30575 1726867635.84243: in run() - task 0affcac9-a3a5-e081-a588-0000000017a8 30575 1726867635.84254: variable 'ansible_search_path' from source: unknown 30575 1726867635.84258: variable 'ansible_search_path' from source: unknown 30575 1726867635.84288: calling self._execute() 30575 1726867635.84357: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.84361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.84369: variable 'omit' from source: magic vars 30575 1726867635.84681: variable 'ansible_distribution_major_version' from source: facts 30575 1726867635.84883: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867635.84886: variable 'omit' from source: magic vars 30575 1726867635.84888: variable 'omit' from source: magic vars 30575 1726867635.84890: variable 'omit' from source: magic vars 30575 1726867635.84894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867635.84897: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867635.84899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867635.84902: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.84918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867635.84950: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867635.84958: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.84965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867635.85067: Set connection var ansible_pipelining to False 30575 1726867635.85075: Set connection var ansible_shell_type to sh 30575 1726867635.85102: Set connection var ansible_shell_executable to /bin/sh 30575 1726867635.85116: Set connection var ansible_timeout to 10 30575 1726867635.85122: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867635.85132: Set connection var ansible_connection to ssh 30575 1726867635.85152: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.85155: variable 'ansible_connection' from source: unknown 30575 1726867635.85158: variable 'ansible_module_compression' from source: unknown 30575 1726867635.85160: variable 'ansible_shell_type' from source: unknown 30575 1726867635.85163: variable 'ansible_shell_executable' from source: unknown 30575 1726867635.85165: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867635.85167: variable 'ansible_pipelining' from source: unknown 30575 1726867635.85169: variable 'ansible_timeout' from source: unknown 30575 1726867635.85174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867635.85274: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867635.85286: variable 'omit' from source: magic vars 30575 1726867635.85291: starting attempt loop 30575 1726867635.85294: running the handler 30575 1726867635.85305: _low_level_execute_command(): starting 30575 1726867635.85315: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867635.85805: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30575 1726867635.85812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867635.85816: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867635.85869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867635.85873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867635.85876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867635.85926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867635.87624: stdout chunk (state=3): >>>/root <<< 30575 1726867635.87723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867635.87749: stderr chunk (state=3): >>><<< 30575 1726867635.87752: stdout chunk (state=3): >>><<< 30575 1726867635.87771: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867635.87784: _low_level_execute_command(): starting 30575 1726867635.87790: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867635.8777058-33969-3062940333183 `" && echo ansible-tmp-1726867635.8777058-33969-3062940333183="` echo /root/.ansible/tmp/ansible-tmp-1726867635.8777058-33969-3062940333183 `" ) && sleep 0' 30575 1726867635.88199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867635.88203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867635.88205: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867635.88217: stderr 
chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867635.88219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867635.88255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867635.88259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867635.88313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867635.90194: stdout chunk (state=3): >>>ansible-tmp-1726867635.8777058-33969-3062940333183=/root/.ansible/tmp/ansible-tmp-1726867635.8777058-33969-3062940333183 <<< 30575 1726867635.90294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867635.90321: stderr chunk (state=3): >>><<< 30575 1726867635.90323: stdout chunk (state=3): >>><<< 30575 1726867635.90337: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867635.8777058-33969-3062940333183=/root/.ansible/tmp/ansible-tmp-1726867635.8777058-33969-3062940333183 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: 
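The `_low_level_execute_command()` call above runs `umask 77 && mkdir -p "…/.ansible/tmp" && mkdir "…/ansible-tmp-<timestamp>-<pid>-<rand>" && echo ansible-tmp-…=…` so that the controller both creates a private remote work directory and learns its expanded path from stdout. A minimal Python sketch of that same pattern (illustrative only, not Ansible's actual code; the `make_remote_tmp` name and default base path are assumptions):

```python
import os
import time

def make_remote_tmp(base="~/.ansible/tmp"):
    """Mimic the remote temp-dir handshake seen in the log: restrictive
    umask, `mkdir -p` on the base directory, then one uniquely named
    subdirectory whose resolved path is returned to the caller."""
    old_umask = os.umask(0o077)              # like `umask 77` in the shell command
    try:
        base = os.path.expanduser(base)
        os.makedirs(base, exist_ok=True)     # `mkdir -p "…/.ansible/tmp"`
        name = "ansible-tmp-%s-%d" % (time.time(), os.getpid())
        path = os.path.join(base, name)
        os.mkdir(path)                       # plain mkdir: fails if it already exists
        return path                          # Ansible echoes this back over stdout
    finally:
        os.umask(old_umask)                  # restore the caller's umask
```

The restrictive umask is the point of the dance: the directory that will briefly hold `AnsiballZ_command.py` ends up mode 0700, readable only by the connecting user.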
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867635.90361: variable 'ansible_module_compression' from source: unknown 30575 1726867635.90401: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867635.90432: variable 'ansible_facts' from source: unknown 30575 1726867635.90489: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867635.8777058-33969-3062940333183/AnsiballZ_command.py 30575 1726867635.90580: Sending initial data 30575 1726867635.90583: Sent initial data (154 bytes) 30575 1726867635.91004: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867635.91007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867635.91012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867635.91015: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 30575 1726867635.91017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867635.91061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867635.91064: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867635.91114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867635.92657: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867635.92664: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867635.92700: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867635.92744: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpizyavlsm /root/.ansible/tmp/ansible-tmp-1726867635.8777058-33969-3062940333183/AnsiballZ_command.py <<< 30575 1726867635.92753: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867635.8777058-33969-3062940333183/AnsiballZ_command.py" <<< 30575 1726867635.92789: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpizyavlsm" to remote "/root/.ansible/tmp/ansible-tmp-1726867635.8777058-33969-3062940333183/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867635.8777058-33969-3062940333183/AnsiballZ_command.py" <<< 30575 1726867635.93330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867635.93366: stderr chunk (state=3): >>><<< 30575 1726867635.93369: stdout chunk (state=3): >>><<< 30575 1726867635.93407: done transferring module to remote 30575 1726867635.93416: _low_level_execute_command(): starting 30575 1726867635.93420: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867635.8777058-33969-3062940333183/ /root/.ansible/tmp/ansible-tmp-1726867635.8777058-33969-3062940333183/AnsiballZ_command.py && sleep 0' 30575 1726867635.93836: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867635.93839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867635.93842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 
1726867635.93844: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867635.93846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867635.93851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867635.93893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867635.93897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867635.93945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867635.95714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867635.95718: stdout chunk (state=3): >>><<< 30575 1726867635.95720: stderr chunk (state=3): >>><<< 30575 1726867635.95732: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867635.95735: _low_level_execute_command(): starting 30575 1726867635.95740: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867635.8777058-33969-3062940333183/AnsiballZ_command.py && sleep 0' 30575 1726867635.96170: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867635.96173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867635.96175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867635.96179: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867635.96182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867635.96226: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867635.96241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867635.96288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867636.11851: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:27:16.113081", "end": "2024-09-20 17:27:16.116372", "delta": "0:00:00.003291", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867636.13374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867636.13381: stdout chunk (state=3): >>><<< 30575 1726867636.13384: stderr chunk (state=3): >>><<< 30575 1726867636.13402: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:27:16.113081", "end": "2024-09-20 17:27:16.116372", "delta": "0:00:00.003291", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
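The module result above comes from running `ls -1` with `chdir: /sys/class/net`, which yields one line per network interface (plus pseudo-entries such as `bonding_masters` when the bonding module is loaded). A rough Python equivalent of what that task gathers (a sketch, not the playbook's actual implementation; the function name is an assumption):

```python
import os

def current_interfaces(net_dir="/sys/class/net"):
    """Approximate the `ls -1` task from the log: list the entries of
    the sysfs network class directory in sorted order, matching ls's
    default lexicographic ordering."""
    try:
        return sorted(os.listdir(net_dir))
    except FileNotFoundError:
        return []  # non-Linux host, or sysfs not mounted
```

On the managed node in this run the result was `['bonding_masters', 'eth0', 'lo']`, matching the `stdout` field in the JSON above.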
30575 1726867636.13434: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867635.8777058-33969-3062940333183/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867636.13440: _low_level_execute_command(): starting 30575 1726867636.13446: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867635.8777058-33969-3062940333183/ > /dev/null 2>&1 && sleep 0' 30575 1726867636.14120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867636.14135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867636.14174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867636.14245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867636.16114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867636.16219: stderr chunk (state=3): >>><<< 30575 1726867636.16223: stdout chunk (state=3): >>><<< 30575 1726867636.16226: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867636.16228: handler run complete 30575 1726867636.16247: Evaluated conditional (False): False 30575 
1726867636.16270: attempt loop complete, returning result 30575 1726867636.16281: _execute() done 30575 1726867636.16289: dumping result to json 30575 1726867636.16373: done dumping result, returning 30575 1726867636.16376: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affcac9-a3a5-e081-a588-0000000017a8] 30575 1726867636.16380: sending task result for task 0affcac9-a3a5-e081-a588-0000000017a8 30575 1726867636.16462: done sending task result for task 0affcac9-a3a5-e081-a588-0000000017a8 30575 1726867636.16466: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003291", "end": "2024-09-20 17:27:16.116372", "rc": 0, "start": "2024-09-20 17:27:16.113081" } STDOUT: bonding_masters eth0 lo 30575 1726867636.16554: no more pending results, returning what we have 30575 1726867636.16559: results queue empty 30575 1726867636.16560: checking for any_errors_fatal 30575 1726867636.16562: done checking for any_errors_fatal 30575 1726867636.16562: checking for max_fail_percentage 30575 1726867636.16564: done checking for max_fail_percentage 30575 1726867636.16565: checking to see if all hosts have failed and the running result is not ok 30575 1726867636.16567: done checking to see if all hosts have failed 30575 1726867636.16567: getting the remaining hosts for this loop 30575 1726867636.16569: done getting the remaining hosts for this loop 30575 1726867636.16573: getting the next task for host managed_node3 30575 1726867636.16691: done getting next task for host managed_node3 30575 1726867636.16695: ^ task is: TASK: Set current_interfaces 30575 1726867636.16702: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867636.16708: getting variables 30575 1726867636.16713: in VariableManager get_vars() 30575 1726867636.16759: Calling all_inventory to load vars for managed_node3 30575 1726867636.16762: Calling groups_inventory to load vars for managed_node3 30575 1726867636.16766: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867636.16887: Calling all_plugins_play to load vars for managed_node3 30575 1726867636.16892: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867636.16897: Calling groups_plugins_play to load vars for managed_node3 30575 1726867636.18668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867636.20507: done with get_vars() 30575 1726867636.20539: done getting variables 30575 1726867636.20618: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 17:27:16 -0400 (0:00:00.370) 0:01:11.584 ****** 30575 1726867636.20653: entering _queue_task() for managed_node3/set_fact 30575 1726867636.21204: worker is 1 (out of 1 available) 30575 1726867636.21219: exiting _queue_task() for managed_node3/set_fact 30575 1726867636.21231: done queuing things up, now waiting for results queue to drain 30575 1726867636.21233: waiting for pending results... 30575 1726867636.21476: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 30575 1726867636.21683: in run() - task 0affcac9-a3a5-e081-a588-0000000017a9 30575 1726867636.21688: variable 'ansible_search_path' from source: unknown 30575 1726867636.21690: variable 'ansible_search_path' from source: unknown 30575 1726867636.21693: calling self._execute() 30575 1726867636.21743: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867636.21757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867636.21774: variable 'omit' from source: magic vars 30575 1726867636.22396: variable 'ansible_distribution_major_version' from source: facts 30575 1726867636.22417: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867636.22430: variable 'omit' from source: magic vars 30575 1726867636.22503: variable 'omit' from source: magic vars 30575 1726867636.22629: variable '_current_interfaces' from source: set_fact 30575 1726867636.22716: variable 'omit' from source: magic vars 30575 1726867636.22773: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 
1726867636.22820: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867636.22846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867636.22874: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867636.22899: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867636.22988: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867636.22994: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867636.22998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867636.23076: Set connection var ansible_pipelining to False 30575 1726867636.23094: Set connection var ansible_shell_type to sh 30575 1726867636.23115: Set connection var ansible_shell_executable to /bin/sh 30575 1726867636.23129: Set connection var ansible_timeout to 10 30575 1726867636.23140: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867636.23202: Set connection var ansible_connection to ssh 30575 1726867636.23207: variable 'ansible_shell_executable' from source: unknown 30575 1726867636.23215: variable 'ansible_connection' from source: unknown 30575 1726867636.23218: variable 'ansible_module_compression' from source: unknown 30575 1726867636.23220: variable 'ansible_shell_type' from source: unknown 30575 1726867636.23221: variable 'ansible_shell_executable' from source: unknown 30575 1726867636.23223: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867636.23225: variable 'ansible_pipelining' from source: unknown 30575 1726867636.23227: variable 'ansible_timeout' from source: unknown 30575 1726867636.23236: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node3' 30575 1726867636.23390: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867636.23409: variable 'omit' from source: magic vars 30575 1726867636.23436: starting attempt loop 30575 1726867636.23444: running the handler 30575 1726867636.23530: handler run complete 30575 1726867636.23534: attempt loop complete, returning result 30575 1726867636.23536: _execute() done 30575 1726867636.23541: dumping result to json 30575 1726867636.23543: done dumping result, returning 30575 1726867636.23546: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affcac9-a3a5-e081-a588-0000000017a9] 30575 1726867636.23548: sending task result for task 0affcac9-a3a5-e081-a588-0000000017a9 ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 30575 1726867636.23689: no more pending results, returning what we have 30575 1726867636.23693: results queue empty 30575 1726867636.23694: checking for any_errors_fatal 30575 1726867636.23703: done checking for any_errors_fatal 30575 1726867636.23704: checking for max_fail_percentage 30575 1726867636.23706: done checking for max_fail_percentage 30575 1726867636.23707: checking to see if all hosts have failed and the running result is not ok 30575 1726867636.23708: done checking to see if all hosts have failed 30575 1726867636.23709: getting the remaining hosts for this loop 30575 1726867636.23713: done getting the remaining hosts for this loop 30575 1726867636.23718: getting the next task for host managed_node3 30575 1726867636.23729: done getting next task for host managed_node3 30575 1726867636.23733: ^ task is: TASK: Show current_interfaces 30575 
1726867636.23738: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867636.23743: getting variables
30575 1726867636.23745: in VariableManager get_vars()
30575 1726867636.23900: Calling all_inventory to load vars for managed_node3
30575 1726867636.23903: Calling groups_inventory to load vars for managed_node3
30575 1726867636.23907: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867636.23922: Calling all_plugins_play to load vars for managed_node3
30575 1726867636.23926: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867636.23929: Calling groups_plugins_play to load vars for managed_node3
30575 1726867636.24656: done sending task result for task 0affcac9-a3a5-e081-a588-0000000017a9
30575 1726867636.24659: WORKER PROCESS EXITING
30575 1726867636.27131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867636.29024: done with get_vars()
30575 1726867636.29057: done getting variables
30575 1726867636.29131: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Show current_interfaces] *************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5
Friday 20 September 2024  17:27:16 -0400 (0:00:00.085)       0:01:11.669 ******
30575 1726867636.29166: entering _queue_task() for managed_node3/debug
30575 1726867636.29571: worker is 1 (out of 1 available)
30575 1726867636.29587: exiting _queue_task() for managed_node3/debug
30575 1726867636.29601: done queuing things up, now waiting for results queue to drain
30575 1726867636.29603: waiting for pending results...
30575 1726867636.29870: running TaskExecutor() for managed_node3/TASK: Show current_interfaces
30575 1726867636.29973: in run() - task 0affcac9-a3a5-e081-a588-00000000176e
30575 1726867636.29996: variable 'ansible_search_path' from source: unknown
30575 1726867636.30000: variable 'ansible_search_path' from source: unknown
30575 1726867636.30104: calling self._execute()
30575 1726867636.30121: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867636.30128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867636.30140: variable 'omit' from source: magic vars
30575 1726867636.30495: variable 'ansible_distribution_major_version' from source: facts
30575 1726867636.30507: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867636.30515: variable 'omit' from source: magic vars
30575 1726867636.30559: variable 'omit' from source: magic vars
30575 1726867636.30652: variable 'current_interfaces' from source: set_fact
30575 1726867636.30685: variable 'omit' from source: magic vars
30575 1726867636.30758: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30575 1726867636.30762: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30575 1726867636.30781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30575 1726867636.30798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867636.30812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867636.30867: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30575 1726867636.30870: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867636.30872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867636.30940: Set connection var ansible_pipelining to False
30575 1726867636.30943: Set connection var ansible_shell_type to sh
30575 1726867636.30974: Set connection var ansible_shell_executable to /bin/sh
30575 1726867636.30979: Set connection var ansible_timeout to 10
30575 1726867636.30982: Set connection var ansible_module_compression to ZIP_DEFLATED
30575 1726867636.30984: Set connection var ansible_connection to ssh
30575 1726867636.30986: variable 'ansible_shell_executable' from source: unknown
30575 1726867636.30991: variable 'ansible_connection' from source: unknown
30575 1726867636.30994: variable 'ansible_module_compression' from source: unknown
30575 1726867636.30996: variable 'ansible_shell_type' from source: unknown
30575 1726867636.31086: variable 'ansible_shell_executable' from source: unknown
30575 1726867636.31089: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867636.31091: variable 'ansible_pipelining' from source: unknown
30575 1726867636.31093: variable 'ansible_timeout' from source: unknown
30575 1726867636.31096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867636.31137: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30575 1726867636.31147: variable 'omit' from source: magic vars
30575 1726867636.31152: starting attempt loop
30575 1726867636.31155: running the handler
30575 1726867636.31202: handler run complete
30575 1726867636.31214: attempt loop complete, returning result
30575 1726867636.31217: _execute() done
30575 1726867636.31220: dumping result to json
30575 1726867636.31222: done dumping result, returning
30575 1726867636.31230: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affcac9-a3a5-e081-a588-00000000176e]
30575 1726867636.31235: sending task result for task 0affcac9-a3a5-e081-a588-00000000176e
30575 1726867636.31509: done sending task result for task 0affcac9-a3a5-e081-a588-00000000176e
30575 1726867636.31514: WORKER PROCESS EXITING
ok: [managed_node3] => {}

MSG:

current_interfaces: ['bonding_masters', 'eth0', 'lo']
30575 1726867636.31551: no more pending results, returning what we have
30575 1726867636.31554: results queue empty
30575 1726867636.31554: checking for any_errors_fatal
30575 1726867636.31559: done checking for any_errors_fatal
30575 1726867636.31560: checking for max_fail_percentage
30575 1726867636.31561: done checking for max_fail_percentage
30575 1726867636.31562: checking to see if all hosts have failed and the running result is not ok
30575 1726867636.31563: done checking to see if all hosts have failed
30575 1726867636.31564: getting the remaining hosts for this loop
30575 1726867636.31565: done getting the remaining hosts for this loop
30575 1726867636.31568: getting the next task for host managed_node3
30575 1726867636.31575: done getting next task for host managed_node3
30575 1726867636.31580: ^ task is: TASK: Setup
30575 1726867636.31583: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867636.31587: getting variables
30575 1726867636.31588: in VariableManager get_vars()
30575 1726867636.31621: Calling all_inventory to load vars for managed_node3
30575 1726867636.31628: Calling groups_inventory to load vars for managed_node3
30575 1726867636.31631: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867636.31640: Calling all_plugins_play to load vars for managed_node3
30575 1726867636.31642: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867636.31645: Calling groups_plugins_play to load vars for managed_node3
30575 1726867636.33174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867636.34809: done with get_vars()
30575 1726867636.34837: done getting variables

TASK [Setup] *******************************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24
Friday 20 September 2024  17:27:16 -0400 (0:00:00.057)       0:01:11.726 ******
30575 1726867636.34944: entering _queue_task() for managed_node3/include_tasks
30575 1726867636.35493: worker is 1 (out of 1 available)
30575 1726867636.35504: exiting _queue_task() for managed_node3/include_tasks
30575 1726867636.35518: done queuing things up, now waiting for results queue to drain
30575 1726867636.35520: waiting for pending results...
30575 1726867636.35672: running TaskExecutor() for managed_node3/TASK: Setup
30575 1726867636.35799: in run() - task 0affcac9-a3a5-e081-a588-000000001747
30575 1726867636.35856: variable 'ansible_search_path' from source: unknown
30575 1726867636.35860: variable 'ansible_search_path' from source: unknown
30575 1726867636.35886: variable 'lsr_setup' from source: include params
30575 1726867636.36106: variable 'lsr_setup' from source: include params
30575 1726867636.36194: variable 'omit' from source: magic vars
30575 1726867636.36338: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867636.36401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867636.36405: variable 'omit' from source: magic vars
30575 1726867636.36633: variable 'ansible_distribution_major_version' from source: facts
30575 1726867636.36646: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867636.36656: variable 'item' from source: unknown
30575 1726867636.36734: variable 'item' from source: unknown
30575 1726867636.36771: variable 'item' from source: unknown
30575 1726867636.36845: variable 'item' from source: unknown
30575 1726867636.37281: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867636.37286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867636.37288: variable 'omit' from source: magic vars
30575 1726867636.37291: variable 'ansible_distribution_major_version' from source: facts
30575 1726867636.37293: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867636.37295: variable 'item' from source: unknown
30575 1726867636.37306: variable 'item' from source: unknown
30575 1726867636.37341: variable 'item' from source: unknown
30575 1726867636.37412: variable 'item' from source: unknown
30575 1726867636.37632: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867636.37638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867636.37641: variable 'omit' from source: magic vars
30575 1726867636.37736: variable 'ansible_distribution_major_version' from source: facts
30575 1726867636.37751: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867636.37762: variable 'item' from source: unknown
30575 1726867636.37827: variable 'item' from source: unknown
30575 1726867636.37868: variable 'item' from source: unknown
30575 1726867636.37934: variable 'item' from source: unknown
30575 1726867636.38067: dumping result to json
30575 1726867636.38070: done dumping result, returning
30575 1726867636.38073: done running TaskExecutor() for managed_node3/TASK: Setup [0affcac9-a3a5-e081-a588-000000001747]
30575 1726867636.38075: sending task result for task 0affcac9-a3a5-e081-a588-000000001747
30575 1726867636.38119: done sending task result for task 0affcac9-a3a5-e081-a588-000000001747
30575 1726867636.38123: WORKER PROCESS EXITING
30575 1726867636.38198: no more pending results, returning what we have
30575 1726867636.38204: in VariableManager get_vars()
30575 1726867636.38253: Calling all_inventory to load vars for managed_node3
30575 1726867636.38256: Calling groups_inventory to load vars for managed_node3
30575 1726867636.38260: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867636.38274: Calling all_plugins_play to load vars for managed_node3
30575 1726867636.38280: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867636.38284: Calling groups_plugins_play to load vars for managed_node3
30575 1726867636.39954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867636.41584: done with get_vars()
30575 1726867636.41605: variable 'ansible_search_path' from source: unknown
30575 1726867636.41607: variable 'ansible_search_path' from source: unknown
30575 1726867636.41654: variable 'ansible_search_path' from source: unknown
30575 1726867636.41656: variable 'ansible_search_path' from source: unknown
30575 1726867636.41688: variable 'ansible_search_path' from source: unknown
30575 1726867636.41689: variable 'ansible_search_path' from source: unknown
30575 1726867636.41718: we have included files to process
30575 1726867636.41720: generating all_blocks data
30575 1726867636.41721: done generating all_blocks data
30575 1726867636.41726: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml
30575 1726867636.41727: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml
30575 1726867636.41729: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml
30575 1726867636.41968: done processing included file
30575 1726867636.41970: iterating over new_blocks loaded from include file
30575 1726867636.41976: in VariableManager get_vars()
30575 1726867636.41993: done with get_vars()
30575 1726867636.41995: filtering new block on tags
30575 1726867636.42033: done filtering new block on tags
30575 1726867636.42036: done iterating over new_blocks loaded from include file
included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node3 => (item=tasks/create_bridge_profile.yml)
30575 1726867636.42040: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml
30575 1726867636.42041: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml
30575 1726867636.42044: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml
30575 1726867636.42142: done processing included file
30575 1726867636.42143: iterating over new_blocks loaded from include file
30575 1726867636.42145: in VariableManager get_vars()
30575 1726867636.42159: done with get_vars()
30575 1726867636.42160: filtering new block on tags
30575 1726867636.42183: done filtering new block on tags
30575 1726867636.42185: done iterating over new_blocks loaded from include file
included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node3 => (item=tasks/activate_profile.yml)
30575 1726867636.42193: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml
30575 1726867636.42194: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml
30575 1726867636.42196: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml
30575 1726867636.42283: done processing included file
30575 1726867636.42285: iterating over new_blocks loaded from include file
30575 1726867636.42286: in VariableManager get_vars()
30575 1726867636.42304: done with get_vars()
30575 1726867636.42306: filtering new block on tags
30575 1726867636.42330: done filtering new block on tags
30575 1726867636.42332: done iterating over new_blocks loaded from include file
included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml for managed_node3 => (item=tasks/remove_profile.yml)
30575 1726867636.42335: extending task lists for all hosts with included blocks
30575 1726867636.43182: done extending task lists
30575 1726867636.43189: done processing included files
30575 1726867636.43190: results queue empty
30575 1726867636.43190: checking for any_errors_fatal
30575 1726867636.43194: done checking for any_errors_fatal
30575 1726867636.43194: checking for max_fail_percentage
30575 1726867636.43196: done checking for max_fail_percentage
30575 1726867636.43196: checking to see if all hosts have failed and the running result is not ok
30575 1726867636.43197: done checking to see if all hosts have failed
30575 1726867636.43198: getting the remaining hosts for this loop
30575 1726867636.43199: done getting the remaining hosts for this loop
30575 1726867636.43202: getting the next task for host managed_node3
30575 1726867636.43206: done getting next task for host managed_node3
30575 1726867636.43208: ^ task is: TASK: Include network role
30575 1726867636.43214: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867636.43217: getting variables
30575 1726867636.43218: in VariableManager get_vars()
30575 1726867636.43228: Calling all_inventory to load vars for managed_node3
30575 1726867636.43230: Calling groups_inventory to load vars for managed_node3
30575 1726867636.43232: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867636.43237: Calling all_plugins_play to load vars for managed_node3
30575 1726867636.43240: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867636.43243: Calling groups_plugins_play to load vars for managed_node3
30575 1726867636.44406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867636.46004: done with get_vars()
30575 1726867636.46026: done getting variables

TASK [Include network role] ****************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3
Friday 20 September 2024  17:27:16 -0400 (0:00:00.111)       0:01:11.838 ******
30575 1726867636.46106: entering _queue_task() for managed_node3/include_role
30575 1726867636.46712: worker is 1 (out of 1 available)
30575 1726867636.46721: exiting _queue_task() for managed_node3/include_role
30575 1726867636.46734: done queuing things up, now waiting for results queue to drain
30575 1726867636.46735: waiting for pending results...
30575 1726867636.46868: running TaskExecutor() for managed_node3/TASK: Include network role
30575 1726867636.46951: in run() - task 0affcac9-a3a5-e081-a588-0000000017d0
30575 1726867636.46988: variable 'ansible_search_path' from source: unknown
30575 1726867636.46995: variable 'ansible_search_path' from source: unknown
30575 1726867636.47039: calling self._execute()
30575 1726867636.47181: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867636.47185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867636.47188: variable 'omit' from source: magic vars
30575 1726867636.47581: variable 'ansible_distribution_major_version' from source: facts
30575 1726867636.47599: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867636.47618: _execute() done
30575 1726867636.47641: dumping result to json
30575 1726867636.47645: done dumping result, returning
30575 1726867636.47720: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcac9-a3a5-e081-a588-0000000017d0]
30575 1726867636.47723: sending task result for task 0affcac9-a3a5-e081-a588-0000000017d0
30575 1726867636.47906: no more pending results, returning what we have
30575 1726867636.47915: in VariableManager get_vars()
30575 1726867636.47962: Calling all_inventory to load vars for managed_node3
30575 1726867636.47964: Calling groups_inventory to load vars for managed_node3
30575 1726867636.47969: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867636.47985: Calling all_plugins_play to load vars for managed_node3
30575 1726867636.47989: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867636.47992: Calling groups_plugins_play to load vars for managed_node3
30575 1726867636.48598: done sending task result for task 0affcac9-a3a5-e081-a588-0000000017d0
30575 1726867636.48603: WORKER PROCESS EXITING
30575 1726867636.49853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867636.51499: done with get_vars()
30575 1726867636.51520: variable 'ansible_search_path' from source: unknown
30575 1726867636.51522: variable 'ansible_search_path' from source: unknown
30575 1726867636.51730: variable 'omit' from source: magic vars
30575 1726867636.51769: variable 'omit' from source: magic vars
30575 1726867636.51785: variable 'omit' from source: magic vars
30575 1726867636.51789: we have included files to process
30575 1726867636.51790: generating all_blocks data
30575 1726867636.51792: done generating all_blocks data
30575 1726867636.51793: processing included file: fedora.linux_system_roles.network
30575 1726867636.51819: in VariableManager get_vars()
30575 1726867636.51832: done with get_vars()
30575 1726867636.51857: in VariableManager get_vars()
30575 1726867636.51872: done with get_vars()
30575 1726867636.51922: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
30575 1726867636.52089: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
30575 1726867636.52290: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
30575 1726867636.53296: in VariableManager get_vars()
30575 1726867636.53319: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
30575 1726867636.55419: iterating over new_blocks loaded from include file
30575 1726867636.55421: in VariableManager get_vars()
30575 1726867636.55443: done with get_vars()
30575 1726867636.55445: filtering new block on tags
30575 1726867636.55795: done filtering new block on tags
30575 1726867636.55799: in VariableManager get_vars()
30575 1726867636.55816: done with get_vars()
30575 1726867636.55818: filtering new block on tags
30575 1726867636.55834: done filtering new block on tags
30575 1726867636.55836: done iterating over new_blocks loaded from include file
included: fedora.linux_system_roles.network for managed_node3
30575 1726867636.55842: extending task lists for all hosts with included blocks
30575 1726867636.56016: done extending task lists
30575 1726867636.56017: done processing included files
30575 1726867636.56018: results queue empty
30575 1726867636.56019: checking for any_errors_fatal
30575 1726867636.56023: done checking for any_errors_fatal
30575 1726867636.56023: checking for max_fail_percentage
30575 1726867636.56024: done checking for max_fail_percentage
30575 1726867636.56025: checking to see if all hosts have failed and the running result is not ok
30575 1726867636.56026: done checking to see if all hosts have failed
30575 1726867636.56027: getting the remaining hosts for this loop
30575 1726867636.56028: done getting the remaining hosts for this loop
30575 1726867636.56030: getting the next task for host managed_node3
30575 1726867636.56035: done getting next task for host managed_node3
30575 1726867636.56037: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
30575 1726867636.56041: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867636.56051: getting variables
30575 1726867636.56052: in VariableManager get_vars()
30575 1726867636.56065: Calling all_inventory to load vars for managed_node3
30575 1726867636.56067: Calling groups_inventory to load vars for managed_node3
30575 1726867636.56069: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867636.56075: Calling all_plugins_play to load vars for managed_node3
30575 1726867636.56081: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867636.56085: Calling groups_plugins_play to load vars for managed_node3
30575 1726867636.57327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867636.59069: done with get_vars()
30575 1726867636.59094: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024  17:27:16 -0400 (0:00:00.130)       0:01:11.969 ******
30575 1726867636.59173: entering _queue_task() for managed_node3/include_tasks
30575 1726867636.59909: worker is 1 (out of 1 available)
30575 1726867636.59923: exiting _queue_task() for managed_node3/include_tasks
30575 1726867636.59936: done queuing things up, now waiting for results queue to drain
30575 1726867636.59938: waiting for pending results...
30575 1726867636.60617: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
30575 1726867636.60798: in run() - task 0affcac9-a3a5-e081-a588-00000000183a
30575 1726867636.60824: variable 'ansible_search_path' from source: unknown
30575 1726867636.60832: variable 'ansible_search_path' from source: unknown
30575 1726867636.60872: calling self._execute()
30575 1726867636.61145: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867636.61160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867636.61174: variable 'omit' from source: magic vars
30575 1726867636.61902: variable 'ansible_distribution_major_version' from source: facts
30575 1726867636.62019: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867636.62023: _execute() done
30575 1726867636.62026: dumping result to json
30575 1726867636.62028: done dumping result, returning
30575 1726867636.62083: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-e081-a588-00000000183a]
30575 1726867636.62086: sending task result for task 0affcac9-a3a5-e081-a588-00000000183a
30575 1726867636.62424: done sending task result for task 0affcac9-a3a5-e081-a588-00000000183a
30575 1726867636.62427: WORKER PROCESS EXITING
30575 1726867636.62479: no more pending results, returning what we have
30575 1726867636.62485: in VariableManager get_vars()
30575 1726867636.62540: Calling all_inventory to load vars for managed_node3
30575 1726867636.62543: Calling groups_inventory to load vars for managed_node3
30575 1726867636.62545: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867636.62559: Calling all_plugins_play to load vars for managed_node3
30575 1726867636.62563: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867636.62566: Calling groups_plugins_play to load vars for managed_node3
30575 1726867636.65491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867636.68039: done with get_vars()
30575 1726867636.68065: variable 'ansible_search_path' from source: unknown
30575 1726867636.68066: variable 'ansible_search_path' from source: unknown
30575 1726867636.68107: we have included files to process
30575 1726867636.68108: generating all_blocks data
30575 1726867636.68113: done generating all_blocks data
30575 1726867636.68116: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30575 1726867636.68117: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30575 1726867636.68119: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30575 1726867636.68751: done processing included file
30575 1726867636.68753: iterating over new_blocks loaded from include file
30575 1726867636.68755: in VariableManager get_vars()
30575 1726867636.68786: done with get_vars()
30575 1726867636.68789: filtering new block on tags
30575 1726867636.68822: done filtering new block on tags
30575 1726867636.68825: in VariableManager get_vars()
30575 1726867636.68846: done with get_vars()
30575 1726867636.68848: filtering new block on tags
30575 1726867636.68896: done filtering new block on tags
30575 1726867636.68898: in VariableManager get_vars()
30575 1726867636.68922: done with get_vars()
30575 1726867636.68923: filtering new block on tags
30575 1726867636.68962: done filtering new block on tags
30575 1726867636.68964: done iterating over new_blocks loaded from include file
included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3
30575 1726867636.68970: extending task lists for all hosts with included blocks
30575 1726867636.70758: done extending task lists
30575 1726867636.70760: done processing included files
30575 1726867636.70761: results queue empty
30575 1726867636.70761: checking for any_errors_fatal
30575 1726867636.70765: done checking for any_errors_fatal
30575 1726867636.70765: checking for max_fail_percentage
30575 1726867636.70766: done checking for max_fail_percentage
30575 1726867636.70767: checking to see if all hosts have failed and the running result is not ok
30575 1726867636.70768: done checking to see if all hosts have failed
30575 1726867636.70769: getting the remaining hosts for this loop
30575 1726867636.70770: done getting the remaining hosts for this loop
30575 1726867636.70773: getting the next task for host managed_node3
30575 1726867636.70779: done getting next task for host managed_node3
30575 1726867636.70782: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
30575 1726867636.70786: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867636.70796: getting variables
30575 1726867636.70797: in VariableManager get_vars()
30575 1726867636.70816: Calling all_inventory to load vars for managed_node3
30575 1726867636.70819: Calling groups_inventory to load vars for managed_node3
30575 1726867636.70821: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867636.70826: Calling all_plugins_play to load vars for managed_node3
30575 1726867636.70828: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867636.70831: Calling groups_plugins_play to load vars for managed_node3
30575 1726867636.78651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867636.80395: done with get_vars()
30575 1726867636.80420: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024  17:27:16 -0400 (0:00:00.213)       0:01:12.182 ******
30575 1726867636.80492: entering _queue_task() for managed_node3/setup
30575 1726867636.80992: worker is 1 (out of 1 available)
30575 1726867636.81006: exiting _queue_task() for managed_node3/setup
30575 1726867636.81021: done queuing things up, now waiting for results queue to drain
30575 1726867636.81022: waiting for pending results...
30575 1726867636.81299: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867636.81450: in run() - task 0affcac9-a3a5-e081-a588-000000001897 30575 1726867636.81471: variable 'ansible_search_path' from source: unknown 30575 1726867636.81484: variable 'ansible_search_path' from source: unknown 30575 1726867636.81536: calling self._execute() 30575 1726867636.81643: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867636.81660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867636.81676: variable 'omit' from source: magic vars 30575 1726867636.82095: variable 'ansible_distribution_major_version' from source: facts 30575 1726867636.82159: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867636.82351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867636.86225: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867636.86459: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867636.86573: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867636.86622: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867636.86655: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867636.86859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867636.86899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867636.86933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867636.87172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867636.87176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867636.87180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867636.87262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867636.87320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867636.87417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867636.87439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867636.87620: variable '__network_required_facts' from source: role 
'' defaults 30575 1726867636.87636: variable 'ansible_facts' from source: unknown 30575 1726867636.88493: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30575 1726867636.88503: when evaluation is False, skipping this task 30575 1726867636.88514: _execute() done 30575 1726867636.88582: dumping result to json 30575 1726867636.88587: done dumping result, returning 30575 1726867636.88590: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-e081-a588-000000001897] 30575 1726867636.88598: sending task result for task 0affcac9-a3a5-e081-a588-000000001897 30575 1726867636.88679: done sending task result for task 0affcac9-a3a5-e081-a588-000000001897 30575 1726867636.88682: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867636.88746: no more pending results, returning what we have 30575 1726867636.88750: results queue empty 30575 1726867636.88750: checking for any_errors_fatal 30575 1726867636.88752: done checking for any_errors_fatal 30575 1726867636.88752: checking for max_fail_percentage 30575 1726867636.88754: done checking for max_fail_percentage 30575 1726867636.88755: checking to see if all hosts have failed and the running result is not ok 30575 1726867636.88756: done checking to see if all hosts have failed 30575 1726867636.88757: getting the remaining hosts for this loop 30575 1726867636.88758: done getting the remaining hosts for this loop 30575 1726867636.88762: getting the next task for host managed_node3 30575 1726867636.88779: done getting next task for host managed_node3 30575 1726867636.88783: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867636.88789: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867636.88815: getting variables 30575 1726867636.88817: in VariableManager get_vars() 30575 1726867636.88864: Calling all_inventory to load vars for managed_node3 30575 1726867636.88867: Calling groups_inventory to load vars for managed_node3 30575 1726867636.88870: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867636.89087: Calling all_plugins_play to load vars for managed_node3 30575 1726867636.89091: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867636.89101: Calling groups_plugins_play to load vars for managed_node3 30575 1726867636.91755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867636.93414: done with get_vars() 30575 1726867636.93437: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:27:16 -0400 (0:00:00.130) 0:01:12.313 ****** 30575 1726867636.93551: entering _queue_task() for managed_node3/stat 30575 1726867636.93925: worker is 1 (out of 1 available) 30575 1726867636.93945: exiting _queue_task() for managed_node3/stat 30575 1726867636.93960: done queuing things up, now waiting for results queue to drain 30575 1726867636.93962: waiting for pending results... 
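The earlier skip of `Ensure ansible_facts used by role are present` hinged on the guard `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluating to False: every fact the role needs was already gathered, so the setup task had nothing to do. A rough Python equivalent of that Jinja2 expression (the fact names and values below are made up for illustration, not taken from this run):

```python
# Rough Python equivalent of the Jinja2 guard
# `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`.
# Sample values are illustrative only.
required_facts = ["distribution", "distribution_major_version", "os_family"]
ansible_facts = {
    "distribution": "RedHat",
    "distribution_major_version": "9",
    "os_family": "RedHat",
}

# The `difference` filter keeps items of the left list absent from the right.
missing = [f for f in required_facts if f not in ansible_facts]

# Nothing is missing, so the condition is False and the setup task is skipped.
print(len(missing) > 0)  # → False
```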
30575 1726867636.94399: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867636.94404: in run() - task 0affcac9-a3a5-e081-a588-000000001899 30575 1726867636.94407: variable 'ansible_search_path' from source: unknown 30575 1726867636.94413: variable 'ansible_search_path' from source: unknown 30575 1726867636.94434: calling self._execute() 30575 1726867636.94532: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867636.94538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867636.94553: variable 'omit' from source: magic vars 30575 1726867636.95012: variable 'ansible_distribution_major_version' from source: facts 30575 1726867636.95021: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867636.95224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867636.95526: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867636.95683: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867636.95688: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867636.95706: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867636.95806: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867636.95829: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867636.95860: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867636.95886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867636.95979: variable '__network_is_ostree' from source: set_fact 30575 1726867636.95985: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867636.95988: when evaluation is False, skipping this task 30575 1726867636.95991: _execute() done 30575 1726867636.95994: dumping result to json 30575 1726867636.95999: done dumping result, returning 30575 1726867636.96007: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-e081-a588-000000001899] 30575 1726867636.96081: sending task result for task 0affcac9-a3a5-e081-a588-000000001899 30575 1726867636.96147: done sending task result for task 0affcac9-a3a5-e081-a588-000000001899 30575 1726867636.96149: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867636.96200: no more pending results, returning what we have 30575 1726867636.96203: results queue empty 30575 1726867636.96204: checking for any_errors_fatal 30575 1726867636.96212: done checking for any_errors_fatal 30575 1726867636.96213: checking for max_fail_percentage 30575 1726867636.96215: done checking for max_fail_percentage 30575 1726867636.96216: checking to see if all hosts have failed and the running result is not ok 30575 1726867636.96217: done checking to see if all hosts have failed 30575 1726867636.96217: getting the remaining hosts for this loop 30575 1726867636.96219: done getting the remaining hosts for this loop 30575 
1726867636.96222: getting the next task for host managed_node3 30575 1726867636.96231: done getting next task for host managed_node3 30575 1726867636.96234: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867636.96239: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867636.96256: getting variables 30575 1726867636.96258: in VariableManager get_vars() 30575 1726867636.96293: Calling all_inventory to load vars for managed_node3 30575 1726867636.96295: Calling groups_inventory to load vars for managed_node3 30575 1726867636.96297: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867636.96306: Calling all_plugins_play to load vars for managed_node3 30575 1726867636.96309: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867636.96314: Calling groups_plugins_play to load vars for managed_node3 30575 1726867636.97882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867636.99579: done with get_vars() 30575 1726867636.99601: done getting variables 30575 1726867636.99666: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:27:16 -0400 (0:00:00.061) 0:01:12.374 ****** 30575 1726867636.99705: entering _queue_task() for managed_node3/set_fact 30575 1726867637.00048: worker is 1 (out of 1 available) 30575 1726867637.00061: exiting _queue_task() for managed_node3/set_fact 30575 1726867637.00073: done queuing things up, now waiting for results queue to drain 30575 1726867637.00075: waiting for pending results... 
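Both ostree tasks above (`Check if system is ostree` and `Set flag to indicate system is ostree`) were skipped on the same condition, `not __network_is_ostree is defined`: the fact was cached by an earlier `set_fact` in this run, so the stat probe never re-executes. A minimal sketch of that guard, using an illustrative fact dict:

```python
# Sketch of the `when: not __network_is_ostree is defined` guard in the log.
# Once set_fact has recorded the answer, later invocations skip the probe.
# The fact value here is illustrative.
host_facts = {"__network_is_ostree": False}  # cached by a prior set_fact

def should_run(facts):
    # Mirrors Jinja2's `is defined` test on the fact name: run the probe
    # only while the fact is still absent.
    return "__network_is_ostree" not in facts

print(should_run(host_facts))  # → False, matching "skipping: [managed_node3]"
```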
30575 1726867637.00594: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867637.00601: in run() - task 0affcac9-a3a5-e081-a588-00000000189a 30575 1726867637.00604: variable 'ansible_search_path' from source: unknown 30575 1726867637.00608: variable 'ansible_search_path' from source: unknown 30575 1726867637.00613: calling self._execute() 30575 1726867637.00703: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867637.00707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867637.00725: variable 'omit' from source: magic vars 30575 1726867637.01118: variable 'ansible_distribution_major_version' from source: facts 30575 1726867637.01129: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867637.01309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867637.01590: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867637.01642: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867637.01718: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867637.01751: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867637.01843: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867637.01866: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867637.01891: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867637.01916: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867637.02013: variable '__network_is_ostree' from source: set_fact 30575 1726867637.02017: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867637.02020: when evaluation is False, skipping this task 30575 1726867637.02022: _execute() done 30575 1726867637.02281: dumping result to json 30575 1726867637.02284: done dumping result, returning 30575 1726867637.02287: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-e081-a588-00000000189a] 30575 1726867637.02289: sending task result for task 0affcac9-a3a5-e081-a588-00000000189a 30575 1726867637.02349: done sending task result for task 0affcac9-a3a5-e081-a588-00000000189a 30575 1726867637.02353: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867637.02393: no more pending results, returning what we have 30575 1726867637.02396: results queue empty 30575 1726867637.02397: checking for any_errors_fatal 30575 1726867637.02401: done checking for any_errors_fatal 30575 1726867637.02402: checking for max_fail_percentage 30575 1726867637.02404: done checking for max_fail_percentage 30575 1726867637.02404: checking to see if all hosts have failed and the running result is not ok 30575 1726867637.02406: done checking to see if all hosts have failed 30575 1726867637.02406: getting the remaining hosts for this loop 30575 1726867637.02407: done getting the remaining hosts for this loop 
30575 1726867637.02411: getting the next task for host managed_node3 30575 1726867637.02421: done getting next task for host managed_node3 30575 1726867637.02425: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867637.02430: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867637.02446: getting variables 30575 1726867637.02448: in VariableManager get_vars() 30575 1726867637.02484: Calling all_inventory to load vars for managed_node3 30575 1726867637.02487: Calling groups_inventory to load vars for managed_node3 30575 1726867637.02489: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867637.02497: Calling all_plugins_play to load vars for managed_node3 30575 1726867637.02500: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867637.02503: Calling groups_plugins_play to load vars for managed_node3 30575 1726867637.03846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867637.05652: done with get_vars() 30575 1726867637.05673: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:27:17 -0400 (0:00:00.060) 0:01:12.435 ****** 30575 1726867637.05769: entering _queue_task() for managed_node3/service_facts 30575 1726867637.06039: worker is 1 (out of 1 available) 30575 1726867637.06051: exiting _queue_task() for managed_node3/service_facts 30575 1726867637.06183: done queuing things up, now waiting for results queue to drain 30575 1726867637.06185: waiting for pending results... 
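The `service_facts` execution that follows opens with Ansible's bootstrap probe: `_low_level_execute_command()` running `/bin/sh -c 'echo ~ && sleep 0'` to resolve the remote user's home directory (the log shows it returning `/root`), which anchors the later `~/.ansible/tmp` working directory. Run locally instead of over SSH, the probe looks roughly like this:

```python
# Local stand-in for the first low-level command Ansible sends over SSH:
#   /bin/sh -c 'echo ~ && sleep 0'
# Over SSH this resolves the *remote* home directory; here it resolves
# whatever home the local shell expands `~` to.
import subprocess

result = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True, text=True, check=True,
)
home = result.stdout.strip()
print(home)
```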
30575 1726867637.06363: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867637.06531: in run() - task 0affcac9-a3a5-e081-a588-00000000189c 30575 1726867637.06545: variable 'ansible_search_path' from source: unknown 30575 1726867637.06549: variable 'ansible_search_path' from source: unknown 30575 1726867637.06582: calling self._execute() 30575 1726867637.06673: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867637.06679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867637.06691: variable 'omit' from source: magic vars 30575 1726867637.07069: variable 'ansible_distribution_major_version' from source: facts 30575 1726867637.07081: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867637.07087: variable 'omit' from source: magic vars 30575 1726867637.07178: variable 'omit' from source: magic vars 30575 1726867637.07213: variable 'omit' from source: magic vars 30575 1726867637.07247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867637.07292: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867637.07313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867637.07326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867637.07338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867637.07367: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867637.07383: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867637.07386: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867637.07478: Set connection var ansible_pipelining to False 30575 1726867637.07491: Set connection var ansible_shell_type to sh 30575 1726867637.07497: Set connection var ansible_shell_executable to /bin/sh 30575 1726867637.07582: Set connection var ansible_timeout to 10 30575 1726867637.07585: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867637.07587: Set connection var ansible_connection to ssh 30575 1726867637.07589: variable 'ansible_shell_executable' from source: unknown 30575 1726867637.07592: variable 'ansible_connection' from source: unknown 30575 1726867637.07595: variable 'ansible_module_compression' from source: unknown 30575 1726867637.07598: variable 'ansible_shell_type' from source: unknown 30575 1726867637.07601: variable 'ansible_shell_executable' from source: unknown 30575 1726867637.07603: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867637.07606: variable 'ansible_pipelining' from source: unknown 30575 1726867637.07612: variable 'ansible_timeout' from source: unknown 30575 1726867637.07615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867637.07770: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867637.07784: variable 'omit' from source: magic vars 30575 1726867637.07788: starting attempt loop 30575 1726867637.07790: running the handler 30575 1726867637.07804: _low_level_execute_command(): starting 30575 1726867637.07816: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867637.08618: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867637.08684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867637.08687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867637.08725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867637.08785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867637.10476: stdout chunk (state=3): >>>/root <<< 30575 1726867637.10632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867637.10636: stdout chunk (state=3): >>><<< 30575 1726867637.10640: stderr chunk (state=3): >>><<< 30575 1726867637.10740: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867637.10744: _low_level_execute_command(): starting 30575 1726867637.10747: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867637.1066039-34012-51826997082509 `" && echo ansible-tmp-1726867637.1066039-34012-51826997082509="` echo /root/.ansible/tmp/ansible-tmp-1726867637.1066039-34012-51826997082509 `" ) && sleep 0' 30575 1726867637.11283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867637.11301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867637.11321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867637.11339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867637.11398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867637.11472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867637.11490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867637.11509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867637.11595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867637.13570: stdout chunk (state=3): >>>ansible-tmp-1726867637.1066039-34012-51826997082509=/root/.ansible/tmp/ansible-tmp-1726867637.1066039-34012-51826997082509 <<< 30575 1726867637.13731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867637.13734: stdout chunk (state=3): >>><<< 30575 1726867637.13737: stderr chunk (state=3): >>><<< 30575 1726867637.13752: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867637.1066039-34012-51826997082509=/root/.ansible/tmp/ansible-tmp-1726867637.1066039-34012-51826997082509 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867637.13805: variable 'ansible_module_compression' from source: unknown 30575 1726867637.13943: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30575 1726867637.13947: variable 'ansible_facts' from source: unknown 30575 1726867637.14010: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867637.1066039-34012-51826997082509/AnsiballZ_service_facts.py 30575 1726867637.14201: Sending initial data 30575 1726867637.14204: Sent initial data (161 bytes) 30575 1726867637.14869: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867637.14919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867637.14966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867637.16538: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867637.16616: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867637.16668: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp26kd05sz /root/.ansible/tmp/ansible-tmp-1726867637.1066039-34012-51826997082509/AnsiballZ_service_facts.py <<< 30575 1726867637.16674: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867637.1066039-34012-51826997082509/AnsiballZ_service_facts.py" <<< 30575 1726867637.16717: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp26kd05sz" to remote "/root/.ansible/tmp/ansible-tmp-1726867637.1066039-34012-51826997082509/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867637.1066039-34012-51826997082509/AnsiballZ_service_facts.py" <<< 30575 1726867637.17432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867637.17466: stderr chunk (state=3): >>><<< 30575 1726867637.17481: stdout chunk (state=3): >>><<< 30575 1726867637.17532: done transferring module to remote 30575 1726867637.17617: _low_level_execute_command(): starting 30575 1726867637.17621: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867637.1066039-34012-51826997082509/ /root/.ansible/tmp/ansible-tmp-1726867637.1066039-34012-51826997082509/AnsiballZ_service_facts.py && sleep 0' 30575 1726867637.18185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867637.18204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867637.18302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867637.18333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867637.18349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867637.18375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867637.18444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867637.20193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867637.20220: stderr chunk (state=3): >>><<< 30575 1726867637.20223: stdout chunk (state=3): >>><<< 30575 1726867637.20242: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867637.20245: _low_level_execute_command(): starting 30575 1726867637.20248: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867637.1066039-34012-51826997082509/AnsiballZ_service_facts.py && sleep 0' 30575 1726867637.20652: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867637.20655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867637.20657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867637.20660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867637.20664: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867637.20702: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867637.20715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867637.20776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867638.71389: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30575 1726867638.71456: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-<<< 30575 1726867638.71495: stdout chunk (state=3): >>>boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30575 1726867638.73083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867638.73088: stdout chunk (state=3): >>><<< 30575 1726867638.73091: stderr chunk (state=3): >>><<< 30575 1726867638.73097: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": 
{"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": 
{"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", 
"state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": 
"sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867638.73824: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867637.1066039-34012-51826997082509/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867638.73833: _low_level_execute_command(): starting 30575 1726867638.73838: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867637.1066039-34012-51826997082509/ > /dev/null 2>&1 && sleep 0' 30575 1726867638.74520: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867638.74592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867638.74637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867638.74648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867638.74665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867638.74741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867638.76621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867638.76632: stdout chunk (state=3): >>><<< 30575 1726867638.76787: stderr chunk (state=3): >>><<< 30575 1726867638.76791: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867638.76793: handler run complete 30575 
1726867638.76860: variable 'ansible_facts' from source: unknown 30575 1726867638.77026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867638.77553: variable 'ansible_facts' from source: unknown 30575 1726867638.77699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867638.77902: attempt loop complete, returning result 30575 1726867638.77912: _execute() done 30575 1726867638.77918: dumping result to json 30575 1726867638.77990: done dumping result, returning 30575 1726867638.78006: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-e081-a588-00000000189c] 30575 1726867638.78017: sending task result for task 0affcac9-a3a5-e081-a588-00000000189c ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867638.79153: no more pending results, returning what we have 30575 1726867638.79156: results queue empty 30575 1726867638.79157: checking for any_errors_fatal 30575 1726867638.79162: done checking for any_errors_fatal 30575 1726867638.79163: checking for max_fail_percentage 30575 1726867638.79164: done checking for max_fail_percentage 30575 1726867638.79165: checking to see if all hosts have failed and the running result is not ok 30575 1726867638.79166: done checking to see if all hosts have failed 30575 1726867638.79167: getting the remaining hosts for this loop 30575 1726867638.79168: done getting the remaining hosts for this loop 30575 1726867638.79172: getting the next task for host managed_node3 30575 1726867638.79181: done getting next task for host managed_node3 30575 1726867638.79185: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867638.79191: ^ state is: HOST STATE: block=7, task=2, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867638.79203: getting variables 30575 1726867638.79205: in VariableManager get_vars() 30575 1726867638.79239: Calling all_inventory to load vars for managed_node3 30575 1726867638.79242: Calling groups_inventory to load vars for managed_node3 30575 1726867638.79245: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867638.79256: Calling all_plugins_play to load vars for managed_node3 30575 1726867638.79259: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867638.79262: Calling groups_plugins_play to load vars for managed_node3 30575 1726867638.79792: done sending task result for task 0affcac9-a3a5-e081-a588-00000000189c 30575 1726867638.79796: WORKER PROCESS EXITING 30575 1726867638.80697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867638.82391: done with get_vars() 30575 1726867638.82412: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:27:18 -0400 (0:00:01.767) 0:01:14.202 ****** 30575 1726867638.82521: entering _queue_task() for managed_node3/package_facts 30575 1726867638.82848: worker is 1 (out of 1 available) 30575 1726867638.82972: exiting _queue_task() for managed_node3/package_facts 30575 1726867638.82987: done queuing things up, now waiting for results queue to drain 30575 1726867638.82989: waiting for pending results... 
30575 1726867638.83184: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867638.83379: in run() - task 0affcac9-a3a5-e081-a588-00000000189d 30575 1726867638.83406: variable 'ansible_search_path' from source: unknown 30575 1726867638.83415: variable 'ansible_search_path' from source: unknown 30575 1726867638.83457: calling self._execute() 30575 1726867638.83560: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867638.83572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867638.83588: variable 'omit' from source: magic vars 30575 1726867638.83997: variable 'ansible_distribution_major_version' from source: facts 30575 1726867638.84014: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867638.84026: variable 'omit' from source: magic vars 30575 1726867638.84116: variable 'omit' from source: magic vars 30575 1726867638.84153: variable 'omit' from source: magic vars 30575 1726867638.84205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867638.84245: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867638.84384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867638.84387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867638.84390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867638.84393: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867638.84396: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867638.84398: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867638.84467: Set connection var ansible_pipelining to False 30575 1726867638.84475: Set connection var ansible_shell_type to sh 30575 1726867638.84492: Set connection var ansible_shell_executable to /bin/sh 30575 1726867638.84503: Set connection var ansible_timeout to 10 30575 1726867638.84518: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867638.84530: Set connection var ansible_connection to ssh 30575 1726867638.84558: variable 'ansible_shell_executable' from source: unknown 30575 1726867638.84567: variable 'ansible_connection' from source: unknown 30575 1726867638.84575: variable 'ansible_module_compression' from source: unknown 30575 1726867638.84586: variable 'ansible_shell_type' from source: unknown 30575 1726867638.84598: variable 'ansible_shell_executable' from source: unknown 30575 1726867638.84620: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867638.84623: variable 'ansible_pipelining' from source: unknown 30575 1726867638.84625: variable 'ansible_timeout' from source: unknown 30575 1726867638.84630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867638.84882: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867638.84887: variable 'omit' from source: magic vars 30575 1726867638.84890: starting attempt loop 30575 1726867638.84892: running the handler 30575 1726867638.84894: _low_level_execute_command(): starting 30575 1726867638.84901: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867638.85706: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867638.85757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867638.85771: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867638.85801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867638.85885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867638.87533: stdout chunk (state=3): >>>/root <<< 30575 1726867638.87703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867638.87706: stdout chunk (state=3): >>><<< 30575 1726867638.87709: stderr chunk (state=3): >>><<< 30575 1726867638.87784: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867638.87788: _low_level_execute_command(): starting 30575 1726867638.87791: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867638.8773293-34074-41367544699445 `" && echo ansible-tmp-1726867638.8773293-34074-41367544699445="` echo /root/.ansible/tmp/ansible-tmp-1726867638.8773293-34074-41367544699445 `" ) && sleep 0' 30575 1726867638.88412: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867638.88429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867638.88559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867638.88564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867638.88586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867638.88603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867638.88625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867638.88703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867638.90590: stdout chunk (state=3): >>>ansible-tmp-1726867638.8773293-34074-41367544699445=/root/.ansible/tmp/ansible-tmp-1726867638.8773293-34074-41367544699445 <<< 30575 1726867638.90695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867638.90723: stderr chunk (state=3): >>><<< 30575 1726867638.90725: stdout chunk (state=3): >>><<< 30575 1726867638.90734: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867638.8773293-34074-41367544699445=/root/.ansible/tmp/ansible-tmp-1726867638.8773293-34074-41367544699445 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867638.90783: variable 'ansible_module_compression' from source: unknown 30575 1726867638.90814: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30575 1726867638.90864: variable 'ansible_facts' from source: unknown 30575 1726867638.90988: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867638.8773293-34074-41367544699445/AnsiballZ_package_facts.py 30575 1726867638.91080: Sending initial data 30575 1726867638.91084: Sent initial data (161 bytes) 30575 1726867638.91497: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867638.91501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867638.91503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867638.91506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867638.91542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867638.91560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867638.91605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867638.93131: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867638.93138: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867638.93172: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867638.93215: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpvo0utou7 /root/.ansible/tmp/ansible-tmp-1726867638.8773293-34074-41367544699445/AnsiballZ_package_facts.py <<< 30575 1726867638.93218: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867638.8773293-34074-41367544699445/AnsiballZ_package_facts.py" <<< 30575 1726867638.93255: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpvo0utou7" to remote "/root/.ansible/tmp/ansible-tmp-1726867638.8773293-34074-41367544699445/AnsiballZ_package_facts.py" <<< 30575 1726867638.93265: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867638.8773293-34074-41367544699445/AnsiballZ_package_facts.py" <<< 30575 1726867638.94331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867638.94366: stderr chunk (state=3): >>><<< 30575 1726867638.94369: stdout chunk (state=3): >>><<< 30575 1726867638.94405: done transferring module to remote 30575 1726867638.94416: _low_level_execute_command(): starting 30575 1726867638.94419: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867638.8773293-34074-41367544699445/ /root/.ansible/tmp/ansible-tmp-1726867638.8773293-34074-41367544699445/AnsiballZ_package_facts.py && sleep 0' 30575 1726867638.94845: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867638.94848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867638.94850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867638.94852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867638.94855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867638.94940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867638.94946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867638.95022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867638.96745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867638.96765: stderr chunk (state=3): >>><<< 30575 1726867638.96768: stdout chunk (state=3): >>><<< 30575 1726867638.96784: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867638.96787: _low_level_execute_command(): starting 30575 1726867638.96791: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867638.8773293-34074-41367544699445/AnsiballZ_package_facts.py && sleep 0' 30575 1726867638.97185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867638.97188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867638.97190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867638.97192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867638.97194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 
1726867638.97240: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867638.97244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867638.97298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867639.41421: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": 
[{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", 
"release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30575 1726867639.41451: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", 
"version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30575 1726867639.41456: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": 
[{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": 
"squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30575 1726867639.41488: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": 
"gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 30575 1726867639.41517: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": 
"1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 30575 1726867639.41529: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": 
[{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 30575 1726867639.41557: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": 
"2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", 
"version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30575 1726867639.41566: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", 
"release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": 
"perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30575 1726867639.41601: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": 
[{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30575 1726867639.41611: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": 
"1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 30575 1726867639.41622: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30575 1726867639.43416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867639.43445: stderr chunk (state=3): >>><<< 30575 1726867639.43448: stdout chunk (state=3): >>><<< 30575 1726867639.43488: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867639.44684: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867638.8773293-34074-41367544699445/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867639.44699: _low_level_execute_command(): starting 30575 1726867639.44711: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867638.8773293-34074-41367544699445/ > /dev/null 2>&1 && sleep 0' 30575 1726867639.45175: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867639.45181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867639.45183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867639.45185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match found <<< 30575 1726867639.45187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867639.45230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867639.45234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867639.45293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867639.47102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867639.47130: stderr chunk (state=3): >>><<< 30575 1726867639.47133: stdout chunk (state=3): >>><<< 30575 1726867639.47145: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 30575 1726867639.47150: handler run complete 30575 1726867639.47667: variable 'ansible_facts' from source: unknown 30575 1726867639.47940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867639.49389: variable 'ansible_facts' from source: unknown 30575 1726867639.49638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867639.50020: attempt loop complete, returning result 30575 1726867639.50029: _execute() done 30575 1726867639.50032: dumping result to json 30575 1726867639.50152: done dumping result, returning 30575 1726867639.50160: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-e081-a588-00000000189d] 30575 1726867639.50164: sending task result for task 0affcac9-a3a5-e081-a588-00000000189d 30575 1726867639.52067: done sending task result for task 0affcac9-a3a5-e081-a588-00000000189d 30575 1726867639.52070: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867639.52230: no more pending results, returning what we have 30575 1726867639.52233: results queue empty 30575 1726867639.52234: checking for any_errors_fatal 30575 1726867639.52240: done checking for any_errors_fatal 30575 1726867639.52241: checking for max_fail_percentage 30575 1726867639.52242: done checking for max_fail_percentage 30575 1726867639.52243: checking to see if all hosts have failed and the running result is not ok 30575 1726867639.52244: done checking to see if all hosts have failed 30575 1726867639.52245: getting the remaining hosts for this loop 30575 1726867639.52246: done getting the remaining hosts for this loop 30575 1726867639.52249: getting the next task for host managed_node3 30575 1726867639.52257: done 
getting next task for host managed_node3 30575 1726867639.52261: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867639.52266: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867639.52284: getting variables 30575 1726867639.52286: in VariableManager get_vars() 30575 1726867639.52323: Calling all_inventory to load vars for managed_node3 30575 1726867639.52326: Calling groups_inventory to load vars for managed_node3 30575 1726867639.52329: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867639.52338: Calling all_plugins_play to load vars for managed_node3 30575 1726867639.52341: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867639.52344: Calling groups_plugins_play to load vars for managed_node3 30575 1726867639.53641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867639.54695: done with get_vars() 30575 1726867639.54715: done getting variables 30575 1726867639.54760: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:27:19 -0400 (0:00:00.722) 0:01:14.925 ****** 30575 1726867639.54789: entering _queue_task() for managed_node3/debug 30575 1726867639.55035: worker is 1 (out of 1 available) 30575 1726867639.55050: exiting _queue_task() for managed_node3/debug 30575 1726867639.55062: done queuing things up, now waiting for results queue to drain 30575 1726867639.55064: waiting for pending results... 
30575 1726867639.55248: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867639.55349: in run() - task 0affcac9-a3a5-e081-a588-00000000183b 30575 1726867639.55360: variable 'ansible_search_path' from source: unknown 30575 1726867639.55363: variable 'ansible_search_path' from source: unknown 30575 1726867639.55393: calling self._execute() 30575 1726867639.55466: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867639.55469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867639.55480: variable 'omit' from source: magic vars 30575 1726867639.55919: variable 'ansible_distribution_major_version' from source: facts 30575 1726867639.55923: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867639.55925: variable 'omit' from source: magic vars 30575 1726867639.55927: variable 'omit' from source: magic vars 30575 1726867639.55967: variable 'network_provider' from source: set_fact 30575 1726867639.55996: variable 'omit' from source: magic vars 30575 1726867639.56045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867639.56096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867639.56122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867639.56143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867639.56168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867639.56204: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867639.56383: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867639.56386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867639.56388: Set connection var ansible_pipelining to False 30575 1726867639.56391: Set connection var ansible_shell_type to sh 30575 1726867639.56393: Set connection var ansible_shell_executable to /bin/sh 30575 1726867639.56395: Set connection var ansible_timeout to 10 30575 1726867639.56397: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867639.56399: Set connection var ansible_connection to ssh 30575 1726867639.56401: variable 'ansible_shell_executable' from source: unknown 30575 1726867639.56403: variable 'ansible_connection' from source: unknown 30575 1726867639.56406: variable 'ansible_module_compression' from source: unknown 30575 1726867639.56407: variable 'ansible_shell_type' from source: unknown 30575 1726867639.56412: variable 'ansible_shell_executable' from source: unknown 30575 1726867639.56414: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867639.56416: variable 'ansible_pipelining' from source: unknown 30575 1726867639.56418: variable 'ansible_timeout' from source: unknown 30575 1726867639.56420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867639.56579: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867639.56620: variable 'omit' from source: magic vars 30575 1726867639.56623: starting attempt loop 30575 1726867639.56626: running the handler 30575 1726867639.56671: handler run complete 30575 1726867639.56683: attempt loop complete, returning result 30575 1726867639.56686: _execute() done 30575 1726867639.56689: dumping result to json 30575 1726867639.56691: done dumping result, returning 
30575 1726867639.56698: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-e081-a588-00000000183b] 30575 1726867639.56703: sending task result for task 0affcac9-a3a5-e081-a588-00000000183b ok: [managed_node3] => {} MSG: Using network provider: nm 30575 1726867639.56863: no more pending results, returning what we have 30575 1726867639.56866: results queue empty 30575 1726867639.56867: checking for any_errors_fatal 30575 1726867639.56875: done checking for any_errors_fatal 30575 1726867639.56876: checking for max_fail_percentage 30575 1726867639.56879: done checking for max_fail_percentage 30575 1726867639.56880: checking to see if all hosts have failed and the running result is not ok 30575 1726867639.56881: done checking to see if all hosts have failed 30575 1726867639.56882: getting the remaining hosts for this loop 30575 1726867639.56883: done getting the remaining hosts for this loop 30575 1726867639.56886: getting the next task for host managed_node3 30575 1726867639.56894: done getting next task for host managed_node3 30575 1726867639.56898: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867639.56903: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867639.56917: getting variables 30575 1726867639.56918: in VariableManager get_vars() 30575 1726867639.56952: Calling all_inventory to load vars for managed_node3 30575 1726867639.56954: Calling groups_inventory to load vars for managed_node3 30575 1726867639.56956: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867639.56964: Calling all_plugins_play to load vars for managed_node3 30575 1726867639.56972: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867639.56975: Calling groups_plugins_play to load vars for managed_node3 30575 1726867639.57493: done sending task result for task 0affcac9-a3a5-e081-a588-00000000183b 30575 1726867639.57497: WORKER PROCESS EXITING 30575 1726867639.57800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867639.58739: done with get_vars() 30575 1726867639.58760: done getting variables 30575 1726867639.58818: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:27:19 -0400 (0:00:00.040) 0:01:14.966 ****** 30575 1726867639.58858: entering _queue_task() for managed_node3/fail 30575 1726867639.59136: worker is 1 (out of 1 available) 30575 1726867639.59149: exiting _queue_task() for managed_node3/fail 30575 1726867639.59162: done queuing things up, now waiting for results queue to drain 30575 1726867639.59164: waiting for pending results... 30575 1726867639.59596: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867639.59644: in run() - task 0affcac9-a3a5-e081-a588-00000000183c 30575 1726867639.59665: variable 'ansible_search_path' from source: unknown 30575 1726867639.59676: variable 'ansible_search_path' from source: unknown 30575 1726867639.59728: calling self._execute() 30575 1726867639.59829: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867639.59835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867639.59851: variable 'omit' from source: magic vars 30575 1726867639.60154: variable 'ansible_distribution_major_version' from source: facts 30575 1726867639.60164: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867639.60247: variable 'network_state' from source: role '' defaults 30575 1726867639.60259: Evaluated conditional (network_state != {}): False 30575 1726867639.60262: when evaluation is False, skipping this task 30575 1726867639.60264: _execute() done 30575 1726867639.60267: dumping result to json 30575 1726867639.60270: done dumping result, returning 30575 1726867639.60276: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-e081-a588-00000000183c] 30575 1726867639.60282: sending task result for task 0affcac9-a3a5-e081-a588-00000000183c 30575 1726867639.60367: done sending task result for task 0affcac9-a3a5-e081-a588-00000000183c 30575 1726867639.60370: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867639.60426: no more pending results, returning what we have 30575 1726867639.60429: results queue empty 30575 1726867639.60430: checking for any_errors_fatal 30575 1726867639.60434: done checking for any_errors_fatal 30575 1726867639.60435: checking for max_fail_percentage 30575 1726867639.60437: done checking for max_fail_percentage 30575 1726867639.60438: checking to see if all hosts have failed and the running result is not ok 30575 1726867639.60439: done checking to see if all hosts have failed 30575 1726867639.60439: getting the remaining hosts for this loop 30575 1726867639.60440: done getting the remaining hosts for this loop 30575 1726867639.60444: getting the next task for host managed_node3 30575 1726867639.60450: done getting next task for host managed_node3 30575 1726867639.60454: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867639.60459: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867639.60475: getting variables 30575 1726867639.60476: in VariableManager get_vars() 30575 1726867639.60513: Calling all_inventory to load vars for managed_node3 30575 1726867639.60515: Calling groups_inventory to load vars for managed_node3 30575 1726867639.60517: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867639.60525: Calling all_plugins_play to load vars for managed_node3 30575 1726867639.60527: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867639.60530: Calling groups_plugins_play to load vars for managed_node3 30575 1726867639.61257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867639.62708: done with get_vars() 30575 1726867639.62728: done getting variables 30575 1726867639.62782: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:27:19 -0400 (0:00:00.039) 0:01:15.005 ****** 30575 1726867639.62812: entering _queue_task() for managed_node3/fail 30575 1726867639.63052: worker is 1 (out of 1 available) 30575 1726867639.63066: exiting _queue_task() for managed_node3/fail 30575 1726867639.63080: done queuing things up, now waiting for results queue to drain 30575 1726867639.63082: waiting for pending results... 30575 1726867639.63345: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867639.63438: in run() - task 0affcac9-a3a5-e081-a588-00000000183d 30575 1726867639.63448: variable 'ansible_search_path' from source: unknown 30575 1726867639.63452: variable 'ansible_search_path' from source: unknown 30575 1726867639.63480: calling self._execute() 30575 1726867639.63553: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867639.63556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867639.63565: variable 'omit' from source: magic vars 30575 1726867639.63833: variable 'ansible_distribution_major_version' from source: facts 30575 1726867639.63843: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867639.63926: variable 'network_state' from source: role '' defaults 30575 1726867639.63936: Evaluated conditional (network_state != {}): False 30575 1726867639.63939: when evaluation is False, skipping this task 30575 1726867639.63942: _execute() done 30575 1726867639.63945: dumping result to json 30575 1726867639.63947: done dumping result, returning 30575 1726867639.63957: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-e081-a588-00000000183d] 30575 1726867639.63960: sending task result for task 0affcac9-a3a5-e081-a588-00000000183d 30575 1726867639.64047: done sending task result for task 0affcac9-a3a5-e081-a588-00000000183d 30575 1726867639.64050: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867639.64113: no more pending results, returning what we have 30575 1726867639.64116: results queue empty 30575 1726867639.64117: checking for any_errors_fatal 30575 1726867639.64122: done checking for any_errors_fatal 30575 1726867639.64123: checking for max_fail_percentage 30575 1726867639.64124: done checking for max_fail_percentage 30575 1726867639.64125: checking to see if all hosts have failed and the running result is not ok 30575 1726867639.64126: done checking to see if all hosts have failed 30575 1726867639.64126: getting the remaining hosts for this loop 30575 1726867639.64128: done getting the remaining hosts for this loop 30575 1726867639.64130: getting the next task for host managed_node3 30575 1726867639.64136: done getting next task for host managed_node3 30575 1726867639.64140: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867639.64144: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867639.64161: getting variables 30575 1726867639.64162: in VariableManager get_vars() 30575 1726867639.64196: Calling all_inventory to load vars for managed_node3 30575 1726867639.64198: Calling groups_inventory to load vars for managed_node3 30575 1726867639.64199: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867639.64205: Calling all_plugins_play to load vars for managed_node3 30575 1726867639.64207: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867639.64211: Calling groups_plugins_play to load vars for managed_node3 30575 1726867639.64943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867639.66197: done with get_vars() 30575 1726867639.66214: done getting variables 30575 1726867639.66252: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:27:19 -0400 (0:00:00.034) 0:01:15.040 ****** 30575 1726867639.66274: entering _queue_task() for managed_node3/fail 30575 1726867639.66467: worker is 1 (out of 1 available) 30575 1726867639.66483: exiting _queue_task() for managed_node3/fail 30575 1726867639.66496: done queuing things up, now waiting for results queue to drain 30575 1726867639.66497: waiting for pending results... 30575 1726867639.66666: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867639.66750: in run() - task 0affcac9-a3a5-e081-a588-00000000183e 30575 1726867639.66760: variable 'ansible_search_path' from source: unknown 30575 1726867639.66764: variable 'ansible_search_path' from source: unknown 30575 1726867639.66793: calling self._execute() 30575 1726867639.66864: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867639.66869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867639.66879: variable 'omit' from source: magic vars 30575 1726867639.67143: variable 'ansible_distribution_major_version' from source: facts 30575 1726867639.67152: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867639.67274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867639.68770: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867639.68819: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867639.68845: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867639.68873: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867639.68897: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867639.68953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867639.69242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867639.69262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.69290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867639.69301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867639.69369: variable 'ansible_distribution_major_version' from source: facts 30575 1726867639.69383: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30575 1726867639.69459: variable 'ansible_distribution' from source: facts 30575 1726867639.69462: variable '__network_rh_distros' from source: role '' defaults 30575 1726867639.69469: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30575 1726867639.69625: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867639.69642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867639.69663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.69692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867639.69703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867639.69737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867639.69752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867639.69770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.69800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 
1726867639.69813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867639.69839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867639.69855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867639.69873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.69902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867639.69914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867639.70104: variable 'network_connections' from source: include params 30575 1726867639.70181: variable 'interface' from source: play vars 30575 1726867639.70184: variable 'interface' from source: play vars 30575 1726867639.70185: variable 'network_state' from source: role '' defaults 30575 1726867639.70217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867639.70333: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867639.70360: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867639.70396: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867639.70405: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867639.70441: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867639.70458: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867639.70480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.70497: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867639.70531: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30575 1726867639.70534: when evaluation is False, skipping this task 30575 1726867639.70537: _execute() done 30575 1726867639.70541: dumping result to json 30575 1726867639.70543: done dumping result, returning 30575 1726867639.70552: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-e081-a588-00000000183e] 30575 1726867639.70555: sending task result for task 
0affcac9-a3a5-e081-a588-00000000183e 30575 1726867639.70632: done sending task result for task 0affcac9-a3a5-e081-a588-00000000183e 30575 1726867639.70635: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30575 1726867639.70686: no more pending results, returning what we have 30575 1726867639.70690: results queue empty 30575 1726867639.70690: checking for any_errors_fatal 30575 1726867639.70695: done checking for any_errors_fatal 30575 1726867639.70696: checking for max_fail_percentage 30575 1726867639.70698: done checking for max_fail_percentage 30575 1726867639.70699: checking to see if all hosts have failed and the running result is not ok 30575 1726867639.70700: done checking to see if all hosts have failed 30575 1726867639.70700: getting the remaining hosts for this loop 30575 1726867639.70702: done getting the remaining hosts for this loop 30575 1726867639.70706: getting the next task for host managed_node3 30575 1726867639.70714: done getting next task for host managed_node3 30575 1726867639.70718: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867639.70723: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867639.70742: getting variables 30575 1726867639.70744: in VariableManager get_vars() 30575 1726867639.70791: Calling all_inventory to load vars for managed_node3 30575 1726867639.70794: Calling groups_inventory to load vars for managed_node3 30575 1726867639.70796: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867639.70805: Calling all_plugins_play to load vars for managed_node3 30575 1726867639.70807: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867639.70810: Calling groups_plugins_play to load vars for managed_node3 30575 1726867639.71747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867639.72594: done with get_vars() 30575 1726867639.72610: done getting variables 30575 1726867639.72650: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:27:19 -0400 (0:00:00.063) 0:01:15.104 ****** 30575 1726867639.72675: entering _queue_task() for managed_node3/dnf 30575 1726867639.72906: worker is 1 (out of 1 available) 30575 1726867639.72919: exiting _queue_task() for managed_node3/dnf 30575 1726867639.72931: done queuing things up, now waiting for results queue to drain 30575 1726867639.72933: waiting for pending results... 30575 1726867639.73116: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867639.73220: in run() - task 0affcac9-a3a5-e081-a588-00000000183f 30575 1726867639.73232: variable 'ansible_search_path' from source: unknown 30575 1726867639.73235: variable 'ansible_search_path' from source: unknown 30575 1726867639.73267: calling self._execute() 30575 1726867639.73339: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867639.73343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867639.73352: variable 'omit' from source: magic vars 30575 1726867639.73624: variable 'ansible_distribution_major_version' from source: facts 30575 1726867639.73633: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867639.73769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867639.75265: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867639.75308: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867639.75339: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867639.75366: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867639.75388: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867639.75445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867639.75476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867639.75495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.75523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867639.75534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867639.75616: variable 'ansible_distribution' from source: facts 30575 1726867639.75620: variable 'ansible_distribution_major_version' from source: facts 30575 1726867639.75632: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30575 1726867639.75707: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867639.75791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867639.75808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867639.75827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.75851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867639.75862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867639.75896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867639.75910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867639.75928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.75951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867639.75961: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867639.75989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867639.76009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867639.76028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.76051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867639.76062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867639.76167: variable 'network_connections' from source: include params 30575 1726867639.76178: variable 'interface' from source: play vars 30575 1726867639.76225: variable 'interface' from source: play vars 30575 1726867639.76273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867639.76384: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867639.76410: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867639.76438: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867639.76458: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867639.76489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867639.76505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867639.76528: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.76549: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867639.76591: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867639.76754: variable 'network_connections' from source: include params 30575 1726867639.76759: variable 'interface' from source: play vars 30575 1726867639.76803: variable 'interface' from source: play vars 30575 1726867639.76829: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867639.76832: when evaluation is False, skipping this task 30575 1726867639.76835: _execute() done 30575 1726867639.76837: dumping result to json 30575 1726867639.76841: done dumping result, returning 30575 1726867639.76849: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-00000000183f] 30575 
1726867639.76853: sending task result for task 0affcac9-a3a5-e081-a588-00000000183f 30575 1726867639.76939: done sending task result for task 0affcac9-a3a5-e081-a588-00000000183f 30575 1726867639.76942: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867639.77027: no more pending results, returning what we have 30575 1726867639.77031: results queue empty 30575 1726867639.77032: checking for any_errors_fatal 30575 1726867639.77038: done checking for any_errors_fatal 30575 1726867639.77038: checking for max_fail_percentage 30575 1726867639.77040: done checking for max_fail_percentage 30575 1726867639.77041: checking to see if all hosts have failed and the running result is not ok 30575 1726867639.77042: done checking to see if all hosts have failed 30575 1726867639.77043: getting the remaining hosts for this loop 30575 1726867639.77044: done getting the remaining hosts for this loop 30575 1726867639.77048: getting the next task for host managed_node3 30575 1726867639.77055: done getting next task for host managed_node3 30575 1726867639.77059: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867639.77063: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867639.77084: getting variables 30575 1726867639.77085: in VariableManager get_vars() 30575 1726867639.77122: Calling all_inventory to load vars for managed_node3 30575 1726867639.77125: Calling groups_inventory to load vars for managed_node3 30575 1726867639.77127: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867639.77135: Calling all_plugins_play to load vars for managed_node3 30575 1726867639.77138: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867639.77140: Calling groups_plugins_play to load vars for managed_node3 30575 1726867639.77918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867639.78795: done with get_vars() 30575 1726867639.78813: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867639.78864: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:27:19 -0400 (0:00:00.062) 0:01:15.166 ****** 30575 1726867639.78888: entering _queue_task() for managed_node3/yum 30575 1726867639.79112: worker is 1 (out of 1 available) 30575 1726867639.79126: exiting _queue_task() for managed_node3/yum 30575 1726867639.79138: done queuing things up, now waiting for results queue to drain 30575 1726867639.79140: waiting for pending results... 30575 1726867639.79316: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867639.79403: in run() - task 0affcac9-a3a5-e081-a588-000000001840 30575 1726867639.79482: variable 'ansible_search_path' from source: unknown 30575 1726867639.79487: variable 'ansible_search_path' from source: unknown 30575 1726867639.79490: calling self._execute() 30575 1726867639.79525: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867639.79528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867639.79538: variable 'omit' from source: magic vars 30575 1726867639.79821: variable 'ansible_distribution_major_version' from source: facts 30575 1726867639.79830: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867639.79949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867639.81694: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867639.81738: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867639.81766: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867639.81792: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867639.81814: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867639.81875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867639.81895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867639.81915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.81941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867639.81951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867639.82017: variable 'ansible_distribution_major_version' from source: facts 30575 1726867639.82034: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30575 1726867639.82058: when evaluation is False, skipping this task 30575 1726867639.82062: _execute() done 30575 1726867639.82064: dumping result to json 30575 1726867639.82067: done dumping result, returning 30575 1726867639.82070: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001840] 30575 1726867639.82072: sending task result for task 0affcac9-a3a5-e081-a588-000000001840 30575 1726867639.82154: done sending task result for task 0affcac9-a3a5-e081-a588-000000001840 30575 1726867639.82158: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30575 1726867639.82233: no more pending results, returning what we have 30575 1726867639.82236: results queue empty 30575 1726867639.82237: checking for any_errors_fatal 30575 1726867639.82242: done checking for any_errors_fatal 30575 1726867639.82242: checking for max_fail_percentage 30575 1726867639.82244: done checking for max_fail_percentage 30575 1726867639.82245: checking to see if all hosts have failed and the running result is not ok 30575 1726867639.82246: done checking to see if all hosts have failed 30575 1726867639.82247: getting the remaining hosts for this loop 30575 1726867639.82248: done getting the remaining hosts for this loop 30575 1726867639.82252: getting the next task for host managed_node3 30575 1726867639.82259: done getting next task for host managed_node3 30575 1726867639.82263: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867639.82267: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867639.82286: getting variables 30575 1726867639.82287: in VariableManager get_vars() 30575 1726867639.82321: Calling all_inventory to load vars for managed_node3 30575 1726867639.82323: Calling groups_inventory to load vars for managed_node3 30575 1726867639.82326: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867639.82334: Calling all_plugins_play to load vars for managed_node3 30575 1726867639.82336: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867639.82338: Calling groups_plugins_play to load vars for managed_node3 30575 1726867639.83202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867639.84488: done with get_vars() 30575 1726867639.84511: done getting variables 30575 1726867639.84571: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:27:19 -0400 (0:00:00.057) 0:01:15.223 ****** 30575 1726867639.84610: entering _queue_task() for managed_node3/fail 30575 1726867639.85021: worker is 1 (out of 1 available) 30575 1726867639.85032: exiting _queue_task() for managed_node3/fail 30575 1726867639.85043: done queuing things up, now waiting for results queue to drain 30575 1726867639.85045: waiting for pending results... 30575 1726867639.85287: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867639.85419: in run() - task 0affcac9-a3a5-e081-a588-000000001841 30575 1726867639.85438: variable 'ansible_search_path' from source: unknown 30575 1726867639.85582: variable 'ansible_search_path' from source: unknown 30575 1726867639.85586: calling self._execute() 30575 1726867639.85589: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867639.85591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867639.85605: variable 'omit' from source: magic vars 30575 1726867639.85972: variable 'ansible_distribution_major_version' from source: facts 30575 1726867639.85991: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867639.86112: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867639.86316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867639.88213: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867639.88256: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867639.88285: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867639.88313: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867639.88333: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867639.88394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867639.88427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867639.88445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.88476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867639.88488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867639.88523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867639.88539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867639.88556: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.88586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867639.88596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867639.88625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867639.88642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867639.88658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.88686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867639.88696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867639.88814: variable 'network_connections' from source: include params 30575 1726867639.88821: variable 'interface' from source: play vars 30575 1726867639.88867: variable 'interface' from source: play vars 30575 1726867639.88922: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867639.89031: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867639.89058: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867639.89084: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867639.89114: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867639.89139: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867639.89154: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867639.89171: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.89192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867639.89268: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867639.89467: variable 'network_connections' from source: include params 30575 1726867639.89470: variable 'interface' from source: play vars 30575 1726867639.89522: variable 'interface' from source: play vars 30575 1726867639.89550: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867639.89564: when evaluation is False, skipping this task 30575 
1726867639.89567: _execute() done 30575 1726867639.89570: dumping result to json 30575 1726867639.89573: done dumping result, returning 30575 1726867639.89580: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001841] 30575 1726867639.89603: sending task result for task 0affcac9-a3a5-e081-a588-000000001841 30575 1726867639.89685: done sending task result for task 0affcac9-a3a5-e081-a588-000000001841 30575 1726867639.89688: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867639.89735: no more pending results, returning what we have 30575 1726867639.89739: results queue empty 30575 1726867639.89739: checking for any_errors_fatal 30575 1726867639.89747: done checking for any_errors_fatal 30575 1726867639.89748: checking for max_fail_percentage 30575 1726867639.89749: done checking for max_fail_percentage 30575 1726867639.89750: checking to see if all hosts have failed and the running result is not ok 30575 1726867639.89751: done checking to see if all hosts have failed 30575 1726867639.89752: getting the remaining hosts for this loop 30575 1726867639.89753: done getting the remaining hosts for this loop 30575 1726867639.89757: getting the next task for host managed_node3 30575 1726867639.89765: done getting next task for host managed_node3 30575 1726867639.89768: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30575 1726867639.89773: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867639.89795: getting variables 30575 1726867639.89796: in VariableManager get_vars() 30575 1726867639.89837: Calling all_inventory to load vars for managed_node3 30575 1726867639.89840: Calling groups_inventory to load vars for managed_node3 30575 1726867639.89842: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867639.89850: Calling all_plugins_play to load vars for managed_node3 30575 1726867639.89853: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867639.89855: Calling groups_plugins_play to load vars for managed_node3 30575 1726867639.91076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867639.92052: done with get_vars() 30575 1726867639.92070: done getting variables 30575 1726867639.92114: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:27:19 -0400 (0:00:00.075) 0:01:15.298 ****** 30575 1726867639.92138: entering _queue_task() for managed_node3/package 30575 1726867639.92355: worker is 1 (out of 1 available) 30575 1726867639.92368: exiting _queue_task() for managed_node3/package 30575 1726867639.92382: done queuing things up, now waiting for results queue to drain 30575 1726867639.92384: waiting for pending results... 30575 1726867639.92572: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30575 1726867639.92698: in run() - task 0affcac9-a3a5-e081-a588-000000001842 30575 1726867639.92882: variable 'ansible_search_path' from source: unknown 30575 1726867639.92886: variable 'ansible_search_path' from source: unknown 30575 1726867639.92890: calling self._execute() 30575 1726867639.92933: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867639.92990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867639.93003: variable 'omit' from source: magic vars 30575 1726867639.93450: variable 'ansible_distribution_major_version' from source: facts 30575 1726867639.93467: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867639.93672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867639.93957: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867639.94011: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867639.94051: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867639.94127: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867639.94250: variable 'network_packages' from source: role '' defaults 30575 1726867639.94362: variable '__network_provider_setup' from source: role '' defaults 30575 1726867639.94376: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867639.94441: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867639.94456: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867639.94520: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867639.94684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867639.96591: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867639.96658: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867639.96702: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867639.96785: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867639.96788: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867639.96860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867639.96893: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867639.96922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.96965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867639.96981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867639.97183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867639.97187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867639.97189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.97191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867639.97194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 
1726867639.97347: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867639.97458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867639.97481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867639.97505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.97544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867639.97558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867639.97645: variable 'ansible_python' from source: facts 30575 1726867639.97662: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867639.97743: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867639.97818: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867639.97945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867639.97974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867639.98005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.98050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867639.98182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867639.98186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867639.98197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867639.98199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.98221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867639.98239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867639.98381: variable 'network_connections' from source: include params 
30575 1726867639.98393: variable 'interface' from source: play vars 30575 1726867639.98495: variable 'interface' from source: play vars 30575 1726867639.98587: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867639.98622: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867639.98657: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867639.98695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867639.98751: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867639.99044: variable 'network_connections' from source: include params 30575 1726867639.99055: variable 'interface' from source: play vars 30575 1726867639.99153: variable 'interface' from source: play vars 30575 1726867639.99211: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867639.99292: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867639.99586: variable 'network_connections' from source: include params 30575 1726867639.99596: variable 'interface' from source: play vars 30575 1726867639.99787: variable 'interface' from source: play vars 30575 1726867639.99790: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867639.99792: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867640.00067: variable 'network_connections' 
from source: include params 30575 1726867640.00079: variable 'interface' from source: play vars 30575 1726867640.00155: variable 'interface' from source: play vars 30575 1726867640.00225: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867640.00291: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867640.00305: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867640.00368: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867640.00576: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867640.00984: variable 'network_connections' from source: include params 30575 1726867640.00993: variable 'interface' from source: play vars 30575 1726867640.01050: variable 'interface' from source: play vars 30575 1726867640.01064: variable 'ansible_distribution' from source: facts 30575 1726867640.01072: variable '__network_rh_distros' from source: role '' defaults 30575 1726867640.01083: variable 'ansible_distribution_major_version' from source: facts 30575 1726867640.01115: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867640.01272: variable 'ansible_distribution' from source: facts 30575 1726867640.01285: variable '__network_rh_distros' from source: role '' defaults 30575 1726867640.01295: variable 'ansible_distribution_major_version' from source: facts 30575 1726867640.01307: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867640.01468: variable 'ansible_distribution' from source: facts 30575 1726867640.01480: variable '__network_rh_distros' from source: role '' defaults 30575 1726867640.01491: variable 'ansible_distribution_major_version' from source: facts 30575 1726867640.01565: variable 'network_provider' from source: set_fact 30575 
1726867640.01568: variable 'ansible_facts' from source: unknown 30575 1726867640.02282: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30575 1726867640.02292: when evaluation is False, skipping this task 30575 1726867640.02299: _execute() done 30575 1726867640.02306: dumping result to json 30575 1726867640.02324: done dumping result, returning 30575 1726867640.02328: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-e081-a588-000000001842] 30575 1726867640.02335: sending task result for task 0affcac9-a3a5-e081-a588-000000001842 30575 1726867640.02425: done sending task result for task 0affcac9-a3a5-e081-a588-000000001842 30575 1726867640.02432: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30575 1726867640.02485: no more pending results, returning what we have 30575 1726867640.02488: results queue empty 30575 1726867640.02489: checking for any_errors_fatal 30575 1726867640.02497: done checking for any_errors_fatal 30575 1726867640.02497: checking for max_fail_percentage 30575 1726867640.02499: done checking for max_fail_percentage 30575 1726867640.02500: checking to see if all hosts have failed and the running result is not ok 30575 1726867640.02501: done checking to see if all hosts have failed 30575 1726867640.02502: getting the remaining hosts for this loop 30575 1726867640.02503: done getting the remaining hosts for this loop 30575 1726867640.02507: getting the next task for host managed_node3 30575 1726867640.02516: done getting next task for host managed_node3 30575 1726867640.02520: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867640.02524: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867640.02549: getting variables 30575 1726867640.02550: in VariableManager get_vars() 30575 1726867640.02598: Calling all_inventory to load vars for managed_node3 30575 1726867640.02600: Calling groups_inventory to load vars for managed_node3 30575 1726867640.02603: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867640.02611: Calling all_plugins_play to load vars for managed_node3 30575 1726867640.02614: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867640.02616: Calling groups_plugins_play to load vars for managed_node3 30575 1726867640.03468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867640.04337: done with get_vars() 30575 1726867640.04353: done getting variables 30575 1726867640.04397: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:27:20 -0400 (0:00:00.122) 0:01:15.421 ****** 30575 1726867640.04432: entering _queue_task() for managed_node3/package 30575 1726867640.04705: worker is 1 (out of 1 available) 30575 1726867640.04719: exiting _queue_task() for managed_node3/package 30575 1726867640.04732: done queuing things up, now waiting for results queue to drain 30575 1726867640.04734: waiting for pending results... 
30575 1726867640.04922: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867640.05007: in run() - task 0affcac9-a3a5-e081-a588-000000001843 30575 1726867640.05019: variable 'ansible_search_path' from source: unknown 30575 1726867640.05023: variable 'ansible_search_path' from source: unknown 30575 1726867640.05053: calling self._execute() 30575 1726867640.05127: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867640.05130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867640.05139: variable 'omit' from source: magic vars 30575 1726867640.05405: variable 'ansible_distribution_major_version' from source: facts 30575 1726867640.05415: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867640.05684: variable 'network_state' from source: role '' defaults 30575 1726867640.05687: Evaluated conditional (network_state != {}): False 30575 1726867640.05689: when evaluation is False, skipping this task 30575 1726867640.05691: _execute() done 30575 1726867640.05694: dumping result to json 30575 1726867640.05696: done dumping result, returning 30575 1726867640.05699: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000001843] 30575 1726867640.05701: sending task result for task 0affcac9-a3a5-e081-a588-000000001843 30575 1726867640.05766: done sending task result for task 0affcac9-a3a5-e081-a588-000000001843 30575 1726867640.05768: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867640.05835: no more pending results, returning what we have 30575 1726867640.05839: results queue empty 30575 1726867640.05840: checking 
for any_errors_fatal 30575 1726867640.05846: done checking for any_errors_fatal 30575 1726867640.05847: checking for max_fail_percentage 30575 1726867640.05849: done checking for max_fail_percentage 30575 1726867640.05850: checking to see if all hosts have failed and the running result is not ok 30575 1726867640.05851: done checking to see if all hosts have failed 30575 1726867640.05852: getting the remaining hosts for this loop 30575 1726867640.05853: done getting the remaining hosts for this loop 30575 1726867640.05857: getting the next task for host managed_node3 30575 1726867640.05867: done getting next task for host managed_node3 30575 1726867640.05870: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867640.05875: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867640.05904: getting variables 30575 1726867640.05906: in VariableManager get_vars() 30575 1726867640.05951: Calling all_inventory to load vars for managed_node3 30575 1726867640.05953: Calling groups_inventory to load vars for managed_node3 30575 1726867640.05956: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867640.05973: Calling all_plugins_play to load vars for managed_node3 30575 1726867640.05978: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867640.05982: Calling groups_plugins_play to load vars for managed_node3 30575 1726867640.06884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867640.07730: done with get_vars() 30575 1726867640.07744: done getting variables 30575 1726867640.07785: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:27:20 -0400 (0:00:00.033) 0:01:15.455 ****** 30575 1726867640.07808: entering _queue_task() for managed_node3/package 30575 1726867640.08017: worker is 1 (out of 1 available) 30575 1726867640.08029: exiting _queue_task() for managed_node3/package 30575 1726867640.08043: done queuing things up, now waiting for results queue to drain 30575 1726867640.08044: waiting for pending results... 
30575 1726867640.08217: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867640.08316: in run() - task 0affcac9-a3a5-e081-a588-000000001844 30575 1726867640.08326: variable 'ansible_search_path' from source: unknown 30575 1726867640.08330: variable 'ansible_search_path' from source: unknown 30575 1726867640.08358: calling self._execute() 30575 1726867640.08430: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867640.08435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867640.08443: variable 'omit' from source: magic vars 30575 1726867640.08706: variable 'ansible_distribution_major_version' from source: facts 30575 1726867640.08716: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867640.08797: variable 'network_state' from source: role '' defaults 30575 1726867640.08811: Evaluated conditional (network_state != {}): False 30575 1726867640.08815: when evaluation is False, skipping this task 30575 1726867640.08819: _execute() done 30575 1726867640.08822: dumping result to json 30575 1726867640.08825: done dumping result, returning 30575 1726867640.08828: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000001844] 30575 1726867640.08830: sending task result for task 0affcac9-a3a5-e081-a588-000000001844 30575 1726867640.08923: done sending task result for task 0affcac9-a3a5-e081-a588-000000001844 30575 1726867640.08926: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867640.08969: no more pending results, returning what we have 30575 1726867640.08973: results queue empty 30575 1726867640.08974: checking for 
any_errors_fatal 30575 1726867640.08980: done checking for any_errors_fatal 30575 1726867640.08981: checking for max_fail_percentage 30575 1726867640.08982: done checking for max_fail_percentage 30575 1726867640.08983: checking to see if all hosts have failed and the running result is not ok 30575 1726867640.08984: done checking to see if all hosts have failed 30575 1726867640.08985: getting the remaining hosts for this loop 30575 1726867640.08986: done getting the remaining hosts for this loop 30575 1726867640.08989: getting the next task for host managed_node3 30575 1726867640.08995: done getting next task for host managed_node3 30575 1726867640.08998: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867640.09003: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867640.09022: getting variables 30575 1726867640.09023: in VariableManager get_vars() 30575 1726867640.09053: Calling all_inventory to load vars for managed_node3 30575 1726867640.09055: Calling groups_inventory to load vars for managed_node3 30575 1726867640.09056: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867640.09064: Calling all_plugins_play to load vars for managed_node3 30575 1726867640.09066: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867640.09068: Calling groups_plugins_play to load vars for managed_node3 30575 1726867640.09783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867640.10669: done with get_vars() 30575 1726867640.10694: done getting variables 30575 1726867640.10735: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:27:20 -0400 (0:00:00.029) 0:01:15.485 ****** 30575 1726867640.10759: entering _queue_task() for managed_node3/service 30575 1726867640.10954: worker is 1 (out of 1 available) 30575 1726867640.10967: exiting _queue_task() for managed_node3/service 30575 1726867640.10982: done queuing things up, now waiting for results queue to drain 30575 1726867640.10984: waiting for pending results... 
30575 1726867640.11146: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867640.11234: in run() - task 0affcac9-a3a5-e081-a588-000000001845 30575 1726867640.11245: variable 'ansible_search_path' from source: unknown 30575 1726867640.11248: variable 'ansible_search_path' from source: unknown 30575 1726867640.11274: calling self._execute() 30575 1726867640.11343: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867640.11347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867640.11355: variable 'omit' from source: magic vars 30575 1726867640.11599: variable 'ansible_distribution_major_version' from source: facts 30575 1726867640.11607: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867640.11688: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867640.11816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867640.14082: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867640.14085: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867640.14087: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867640.14089: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867640.14091: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867640.14143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30575 1726867640.14171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867640.14199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867640.14240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867640.14255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867640.14300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867640.14324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867640.14350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867640.14388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867640.14402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867640.14443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867640.14460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867640.14478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867640.14503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867640.14514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867640.14623: variable 'network_connections' from source: include params 30575 1726867640.14631: variable 'interface' from source: play vars 30575 1726867640.14676: variable 'interface' from source: play vars 30575 1726867640.14725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867640.19174: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867640.19211: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867640.19234: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867640.19256: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867640.19291: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867640.19306: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867640.19325: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867640.19346: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867640.19385: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867640.19537: variable 'network_connections' from source: include params 30575 1726867640.19540: variable 'interface' from source: play vars 30575 1726867640.19589: variable 'interface' from source: play vars 30575 1726867640.19615: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867640.19618: when evaluation is False, skipping this task 30575 1726867640.19621: _execute() done 30575 1726867640.19623: dumping result to json 30575 1726867640.19625: done dumping result, returning 30575 1726867640.19628: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001845] 30575 1726867640.19633: sending task result for task 0affcac9-a3a5-e081-a588-000000001845 30575 1726867640.19719: done sending task result for task 
0affcac9-a3a5-e081-a588-000000001845 30575 1726867640.19728: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867640.19768: no more pending results, returning what we have 30575 1726867640.19771: results queue empty 30575 1726867640.19772: checking for any_errors_fatal 30575 1726867640.19782: done checking for any_errors_fatal 30575 1726867640.19782: checking for max_fail_percentage 30575 1726867640.19784: done checking for max_fail_percentage 30575 1726867640.19785: checking to see if all hosts have failed and the running result is not ok 30575 1726867640.19786: done checking to see if all hosts have failed 30575 1726867640.19787: getting the remaining hosts for this loop 30575 1726867640.19788: done getting the remaining hosts for this loop 30575 1726867640.19792: getting the next task for host managed_node3 30575 1726867640.19799: done getting next task for host managed_node3 30575 1726867640.19803: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867640.19807: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867640.19827: getting variables 30575 1726867640.19829: in VariableManager get_vars() 30575 1726867640.19866: Calling all_inventory to load vars for managed_node3 30575 1726867640.19868: Calling groups_inventory to load vars for managed_node3 30575 1726867640.19870: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867640.19881: Calling all_plugins_play to load vars for managed_node3 30575 1726867640.19883: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867640.19886: Calling groups_plugins_play to load vars for managed_node3 30575 1726867640.25679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867640.27154: done with get_vars() 30575 1726867640.27183: done getting variables 30575 1726867640.27234: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:27:20 -0400 (0:00:00.165) 0:01:15.650 ****** 30575 1726867640.27263: entering _queue_task() for managed_node3/service 30575 1726867640.27623: worker is 1 (out of 1 available) 30575 1726867640.27635: exiting _queue_task() for managed_node3/service 30575 1726867640.27648: done 
queuing things up, now waiting for results queue to drain 30575 1726867640.27650: waiting for pending results... 30575 1726867640.28099: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867640.28113: in run() - task 0affcac9-a3a5-e081-a588-000000001846 30575 1726867640.28133: variable 'ansible_search_path' from source: unknown 30575 1726867640.28140: variable 'ansible_search_path' from source: unknown 30575 1726867640.28181: calling self._execute() 30575 1726867640.28290: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867640.28413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867640.28417: variable 'omit' from source: magic vars 30575 1726867640.28717: variable 'ansible_distribution_major_version' from source: facts 30575 1726867640.28739: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867640.28923: variable 'network_provider' from source: set_fact 30575 1726867640.28934: variable 'network_state' from source: role '' defaults 30575 1726867640.28953: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30575 1726867640.28963: variable 'omit' from source: magic vars 30575 1726867640.29034: variable 'omit' from source: magic vars 30575 1726867640.29068: variable 'network_service_name' from source: role '' defaults 30575 1726867640.29144: variable 'network_service_name' from source: role '' defaults 30575 1726867640.29264: variable '__network_provider_setup' from source: role '' defaults 30575 1726867640.29281: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867640.29347: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867640.29363: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867640.29427: variable '__network_packages_default_nm' from source: role '' 
defaults 30575 1726867640.29656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867640.32772: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867640.32971: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867640.32974: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867640.32979: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867640.32982: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867640.33059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867640.33101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867640.33138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867640.33213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867640.33235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867640.33284: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867640.33322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867640.33416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867640.33512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867640.33533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867640.33980: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867640.34111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867640.34140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867640.34168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867640.34217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867640.34236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867640.34339: variable 'ansible_python' from source: facts 30575 1726867640.34360: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867640.34452: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867640.34652: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867640.34675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867640.34711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867640.34741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867640.34793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867640.34816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867640.34872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867640.34907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867640.34934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867640.34969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867640.35085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867640.35238: variable 'network_connections' from source: include params 30575 1726867640.35248: variable 'interface' from source: play vars 30575 1726867640.35422: variable 'interface' from source: play vars 30575 1726867640.35738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867640.35979: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867640.36040: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867640.36090: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867640.36146: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867640.36221: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867640.36257: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867640.36298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867640.36345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867640.36402: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867640.36712: variable 'network_connections' from source: include params 30575 1726867640.36725: variable 'interface' from source: play vars 30575 1726867640.36799: variable 'interface' from source: play vars 30575 1726867640.36847: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867640.36933: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867640.37225: variable 'network_connections' from source: include params 30575 1726867640.37235: variable 'interface' from source: play vars 30575 1726867640.37317: variable 'interface' from source: play vars 30575 1726867640.37345: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867640.37429: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867640.37732: variable 'network_connections' from source: include params 30575 1726867640.37742: variable 'interface' from source: play vars 30575 1726867640.37817: variable 'interface' from source: play vars 30575 1726867640.37887: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30575 1726867640.37956: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867640.37967: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867640.38033: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867640.38267: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867640.38821: variable 'network_connections' from source: include params 30575 1726867640.38831: variable 'interface' from source: play vars 30575 1726867640.38897: variable 'interface' from source: play vars 30575 1726867640.38927: variable 'ansible_distribution' from source: facts 30575 1726867640.38930: variable '__network_rh_distros' from source: role '' defaults 30575 1726867640.39036: variable 'ansible_distribution_major_version' from source: facts 30575 1726867640.39039: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867640.39162: variable 'ansible_distribution' from source: facts 30575 1726867640.39171: variable '__network_rh_distros' from source: role '' defaults 30575 1726867640.39183: variable 'ansible_distribution_major_version' from source: facts 30575 1726867640.39199: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867640.39390: variable 'ansible_distribution' from source: facts 30575 1726867640.39400: variable '__network_rh_distros' from source: role '' defaults 30575 1726867640.39411: variable 'ansible_distribution_major_version' from source: facts 30575 1726867640.39449: variable 'network_provider' from source: set_fact 30575 1726867640.39483: variable 'omit' from source: magic vars 30575 1726867640.39517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867640.39550: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867640.39576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867640.39605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867640.39626: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867640.39664: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867640.39694: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867640.39697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867640.39805: Set connection var ansible_pipelining to False 30575 1726867640.39819: Set connection var ansible_shell_type to sh 30575 1726867640.39882: Set connection var ansible_shell_executable to /bin/sh 30575 1726867640.39885: Set connection var ansible_timeout to 10 30575 1726867640.39888: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867640.39890: Set connection var ansible_connection to ssh 30575 1726867640.39892: variable 'ansible_shell_executable' from source: unknown 30575 1726867640.39894: variable 'ansible_connection' from source: unknown 30575 1726867640.39900: variable 'ansible_module_compression' from source: unknown 30575 1726867640.39914: variable 'ansible_shell_type' from source: unknown 30575 1726867640.39922: variable 'ansible_shell_executable' from source: unknown 30575 1726867640.39930: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867640.39939: variable 'ansible_pipelining' from source: unknown 30575 1726867640.39945: variable 'ansible_timeout' from source: unknown 30575 1726867640.39953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867640.40063: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867640.40128: variable 'omit' from source: magic vars 30575 1726867640.40131: starting attempt loop 30575 1726867640.40134: running the handler 30575 1726867640.40185: variable 'ansible_facts' from source: unknown 30575 1726867640.41268: _low_level_execute_command(): starting 30575 1726867640.41283: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867640.42114: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867640.42132: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867640.42195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867640.42401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867640.42404: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30575 1726867640.42606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867640.44814: stdout chunk (state=3): >>>/root <<< 30575 1726867640.44818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867640.44821: stdout chunk (state=3): >>><<< 30575 1726867640.44823: stderr chunk (state=3): >>><<< 30575 1726867640.44826: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867640.44829: _low_level_execute_command(): starting 30575 1726867640.44832: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867640.4456425-34122-134752347090315 `" && echo 
ansible-tmp-1726867640.4456425-34122-134752347090315="` echo /root/.ansible/tmp/ansible-tmp-1726867640.4456425-34122-134752347090315 `" ) && sleep 0' 30575 1726867640.45693: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867640.45729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867640.45732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867640.45735: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867640.45738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867640.45802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867640.45805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867640.45813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867640.45992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867640.48012: stdout chunk (state=3): >>>ansible-tmp-1726867640.4456425-34122-134752347090315=/root/.ansible/tmp/ansible-tmp-1726867640.4456425-34122-134752347090315 <<< 30575 1726867640.48024: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 30575 1726867640.48058: stderr chunk (state=3): >>><<< 30575 1726867640.48078: stdout chunk (state=3): >>><<< 30575 1726867640.48181: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867640.4456425-34122-134752347090315=/root/.ansible/tmp/ansible-tmp-1726867640.4456425-34122-134752347090315 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867640.48216: variable 'ansible_module_compression' from source: unknown 30575 1726867640.48502: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30575 1726867640.48505: variable 'ansible_facts' from source: unknown 30575 1726867640.48969: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867640.4456425-34122-134752347090315/AnsiballZ_systemd.py 30575 
1726867640.49592: Sending initial data 30575 1726867640.49595: Sent initial data (156 bytes) 30575 1726867640.50554: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867640.50702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867640.50864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867640.50890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867640.50952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867640.52541: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports 
extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867640.52582: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867640.52680: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp3gcc_xcd /root/.ansible/tmp/ansible-tmp-1726867640.4456425-34122-134752347090315/AnsiballZ_systemd.py <<< 30575 1726867640.52688: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867640.4456425-34122-134752347090315/AnsiballZ_systemd.py" <<< 30575 1726867640.52891: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp3gcc_xcd" to remote "/root/.ansible/tmp/ansible-tmp-1726867640.4456425-34122-134752347090315/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867640.4456425-34122-134752347090315/AnsiballZ_systemd.py" <<< 30575 1726867640.54715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867640.54755: stderr chunk (state=3): >>><<< 30575 1726867640.54764: stdout chunk (state=3): >>><<< 30575 1726867640.54827: done transferring module to remote 30575 1726867640.54848: _low_level_execute_command(): starting 30575 1726867640.54860: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867640.4456425-34122-134752347090315/ /root/.ansible/tmp/ansible-tmp-1726867640.4456425-34122-134752347090315/AnsiballZ_systemd.py && sleep 0' 30575 1726867640.55500: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867640.55518: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30575 1726867640.55595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867640.55668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867640.55700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867640.56214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867640.57875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867640.57880: stdout chunk (state=3): >>><<< 30575 1726867640.57882: stderr chunk (state=3): >>><<< 30575 1726867640.57885: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867640.57887: _low_level_execute_command(): starting 30575 1726867640.57889: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867640.4456425-34122-134752347090315/AnsiballZ_systemd.py && sleep 0' 30575 1726867640.58397: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867640.58412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867640.58426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867640.58447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867640.58463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867640.58472: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867640.58561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867640.58578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867640.58593: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867640.58612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867640.58705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867640.87809: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 
17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10518528", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314851840", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1906606000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not 
set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", 
"NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30575 1726867640.89457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867640.89521: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867640.89531: stdout chunk (state=3): >>><<< 30575 1726867640.89549: stderr chunk (state=3): >>><<< 30575 1726867640.89567: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10518528", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314851840", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1906606000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867640.90085: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867640.4456425-34122-134752347090315/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867640.90098: _low_level_execute_command(): starting 30575 1726867640.90111: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867640.4456425-34122-134752347090315/ > /dev/null 2>&1 && sleep 0' 30575 1726867640.91395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867640.91521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867640.91532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867640.91547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867640.91558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867640.91565: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867640.91626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867640.91719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867640.91736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867640.91744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867640.91822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867640.93834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867640.93838: stdout chunk (state=3): >>><<< 30575 1726867640.93844: stderr chunk (state=3): >>><<< 30575 1726867640.93860: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867640.93867: handler run complete 30575 1726867640.94044: attempt loop complete, returning result 30575 1726867640.94053: _execute() done 30575 1726867640.94056: dumping result to json 30575 1726867640.94068: done dumping result, returning 30575 1726867640.94080: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-e081-a588-000000001846] 30575 1726867640.94085: sending task result for task 0affcac9-a3a5-e081-a588-000000001846 30575 1726867640.94685: done sending task result for task 0affcac9-a3a5-e081-a588-000000001846 30575 1726867640.94687: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867640.94741: no more pending results, returning what we have 30575 1726867640.94745: results queue empty 30575 1726867640.94746: checking for any_errors_fatal 30575 1726867640.94753: done checking for any_errors_fatal 30575 1726867640.94753: checking for max_fail_percentage 30575 1726867640.94755: done checking for max_fail_percentage 30575 1726867640.94756: checking to see if all hosts have failed and the running result is not ok 30575 1726867640.94757: done checking to see if all hosts have failed 30575 1726867640.94758: getting the remaining hosts for this loop 30575 1726867640.94760: done getting the remaining hosts for this loop 30575 1726867640.94763: getting the next task for host managed_node3 30575 1726867640.94772: done getting next task for host managed_node3 30575 1726867640.94776: ^ task is: TASK: fedora.linux_system_roles.network : Enable 
and start wpa_supplicant 30575 1726867640.94785: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867640.94799: getting variables 30575 1726867640.94800: in VariableManager get_vars() 30575 1726867640.94836: Calling all_inventory to load vars for managed_node3 30575 1726867640.94839: Calling groups_inventory to load vars for managed_node3 30575 1726867640.94841: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867640.94851: Calling all_plugins_play to load vars for managed_node3 30575 1726867640.94855: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867640.94858: Calling groups_plugins_play to load vars for managed_node3 30575 1726867640.97424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867640.99097: done with get_vars() 30575 1726867640.99119: done getting variables 30575 1726867640.99179: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:27:20 -0400 (0:00:00.719) 0:01:16.369 ****** 30575 1726867640.99217: entering _queue_task() for managed_node3/service 30575 1726867640.99955: worker is 1 (out of 1 available) 30575 1726867640.99970: exiting _queue_task() for managed_node3/service 30575 1726867640.99985: done queuing things up, now waiting for results queue to drain 30575 1726867640.99987: waiting for pending results... 
30575 1726867641.00699: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867641.00836: in run() - task 0affcac9-a3a5-e081-a588-000000001847 30575 1726867641.00921: variable 'ansible_search_path' from source: unknown 30575 1726867641.00931: variable 'ansible_search_path' from source: unknown 30575 1726867641.00973: calling self._execute() 30575 1726867641.01257: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867641.01271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867641.01288: variable 'omit' from source: magic vars 30575 1726867641.01866: variable 'ansible_distribution_major_version' from source: facts 30575 1726867641.01902: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867641.02070: variable 'network_provider' from source: set_fact 30575 1726867641.02084: Evaluated conditional (network_provider == "nm"): True 30575 1726867641.02174: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867641.02445: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867641.02716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867641.05249: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867641.05318: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867641.05432: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867641.05469: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867641.05514: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867641.05885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867641.05889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867641.05892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867641.05914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867641.05938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867641.06183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867641.06187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867641.06190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867641.06291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867641.06314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867641.06360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867641.06442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867641.06473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867641.06580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867641.06602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867641.06903: variable 'network_connections' from source: include params 30575 1726867641.06951: variable 'interface' from source: play vars 30575 1726867641.07079: variable 'interface' from source: play vars 30575 1726867641.07106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867641.07282: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867641.07325: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867641.07360: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867641.07400: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867641.07450: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867641.07480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867641.07514: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867641.07583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867641.07603: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867641.07863: variable 'network_connections' from source: include params 30575 1726867641.07874: variable 'interface' from source: play vars 30575 1726867641.07938: variable 'interface' from source: play vars 30575 1726867641.07992: Evaluated conditional (__network_wpa_supplicant_required): False 30575 1726867641.08066: when evaluation is False, skipping this task 30575 1726867641.08069: _execute() done 30575 1726867641.08072: dumping result to json 30575 1726867641.08074: done dumping result, returning 30575 1726867641.08076: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-e081-a588-000000001847] 30575 
1726867641.08087: sending task result for task 0affcac9-a3a5-e081-a588-000000001847 30575 1726867641.08155: done sending task result for task 0affcac9-a3a5-e081-a588-000000001847 30575 1726867641.08158: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30575 1726867641.08218: no more pending results, returning what we have 30575 1726867641.08221: results queue empty 30575 1726867641.08222: checking for any_errors_fatal 30575 1726867641.08244: done checking for any_errors_fatal 30575 1726867641.08245: checking for max_fail_percentage 30575 1726867641.08247: done checking for max_fail_percentage 30575 1726867641.08249: checking to see if all hosts have failed and the running result is not ok 30575 1726867641.08250: done checking to see if all hosts have failed 30575 1726867641.08250: getting the remaining hosts for this loop 30575 1726867641.08252: done getting the remaining hosts for this loop 30575 1726867641.08257: getting the next task for host managed_node3 30575 1726867641.08265: done getting next task for host managed_node3 30575 1726867641.08269: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867641.08275: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867641.08300: getting variables 30575 1726867641.08302: in VariableManager get_vars() 30575 1726867641.08344: Calling all_inventory to load vars for managed_node3 30575 1726867641.08346: Calling groups_inventory to load vars for managed_node3 30575 1726867641.08349: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867641.08359: Calling all_plugins_play to load vars for managed_node3 30575 1726867641.08362: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867641.08364: Calling groups_plugins_play to load vars for managed_node3 30575 1726867641.09954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867641.11510: done with get_vars() 30575 1726867641.11530: done getting variables 30575 1726867641.11587: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:27:21 -0400 (0:00:00.124) 0:01:16.493 
****** 30575 1726867641.11620: entering _queue_task() for managed_node3/service 30575 1726867641.11906: worker is 1 (out of 1 available) 30575 1726867641.11919: exiting _queue_task() for managed_node3/service 30575 1726867641.11934: done queuing things up, now waiting for results queue to drain 30575 1726867641.11936: waiting for pending results... 30575 1726867641.12303: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867641.12374: in run() - task 0affcac9-a3a5-e081-a588-000000001848 30575 1726867641.12402: variable 'ansible_search_path' from source: unknown 30575 1726867641.12412: variable 'ansible_search_path' from source: unknown 30575 1726867641.12452: calling self._execute() 30575 1726867641.12554: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867641.12619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867641.12623: variable 'omit' from source: magic vars 30575 1726867641.12954: variable 'ansible_distribution_major_version' from source: facts 30575 1726867641.12972: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867641.13098: variable 'network_provider' from source: set_fact 30575 1726867641.13110: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867641.13118: when evaluation is False, skipping this task 30575 1726867641.13125: _execute() done 30575 1726867641.13133: dumping result to json 30575 1726867641.13162: done dumping result, returning 30575 1726867641.13165: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-e081-a588-000000001848] 30575 1726867641.13168: sending task result for task 0affcac9-a3a5-e081-a588-000000001848 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
30575 1726867641.13420: no more pending results, returning what we have 30575 1726867641.13424: results queue empty 30575 1726867641.13424: checking for any_errors_fatal 30575 1726867641.13431: done checking for any_errors_fatal 30575 1726867641.13432: checking for max_fail_percentage 30575 1726867641.13434: done checking for max_fail_percentage 30575 1726867641.13435: checking to see if all hosts have failed and the running result is not ok 30575 1726867641.13435: done checking to see if all hosts have failed 30575 1726867641.13436: getting the remaining hosts for this loop 30575 1726867641.13437: done getting the remaining hosts for this loop 30575 1726867641.13441: getting the next task for host managed_node3 30575 1726867641.13449: done getting next task for host managed_node3 30575 1726867641.13453: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867641.13458: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867641.13480: getting variables 30575 1726867641.13482: in VariableManager get_vars() 30575 1726867641.13521: Calling all_inventory to load vars for managed_node3 30575 1726867641.13523: Calling groups_inventory to load vars for managed_node3 30575 1726867641.13526: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867641.13537: Calling all_plugins_play to load vars for managed_node3 30575 1726867641.13540: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867641.13543: Calling groups_plugins_play to load vars for managed_node3 30575 1726867641.14091: done sending task result for task 0affcac9-a3a5-e081-a588-000000001848 30575 1726867641.14094: WORKER PROCESS EXITING 30575 1726867641.15061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867641.16558: done with get_vars() 30575 1726867641.16581: done getting variables 30575 1726867641.16637: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:27:21 -0400 (0:00:00.050) 0:01:16.544 ****** 30575 1726867641.16670: entering _queue_task() for managed_node3/copy 30575 1726867641.16948: worker is 1 (out of 1 available) 30575 1726867641.16959: exiting _queue_task() for managed_node3/copy 30575 1726867641.16972: done queuing things up, now waiting for results queue to drain 30575 1726867641.16973: waiting for pending 
results... 30575 1726867641.17257: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867641.17398: in run() - task 0affcac9-a3a5-e081-a588-000000001849 30575 1726867641.17421: variable 'ansible_search_path' from source: unknown 30575 1726867641.17430: variable 'ansible_search_path' from source: unknown 30575 1726867641.17471: calling self._execute() 30575 1726867641.17579: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867641.17592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867641.17607: variable 'omit' from source: magic vars 30575 1726867641.17997: variable 'ansible_distribution_major_version' from source: facts 30575 1726867641.18013: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867641.18140: variable 'network_provider' from source: set_fact 30575 1726867641.18152: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867641.18165: when evaluation is False, skipping this task 30575 1726867641.18174: _execute() done 30575 1726867641.18185: dumping result to json 30575 1726867641.18194: done dumping result, returning 30575 1726867641.18208: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-e081-a588-000000001849] 30575 1726867641.18220: sending task result for task 0affcac9-a3a5-e081-a588-000000001849 30575 1726867641.18433: done sending task result for task 0affcac9-a3a5-e081-a588-000000001849 30575 1726867641.18437: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30575 1726867641.18487: no more pending results, returning what we have 30575 1726867641.18492: results queue empty 30575 1726867641.18493: 
checking for any_errors_fatal 30575 1726867641.18501: done checking for any_errors_fatal 30575 1726867641.18502: checking for max_fail_percentage 30575 1726867641.18503: done checking for max_fail_percentage 30575 1726867641.18504: checking to see if all hosts have failed and the running result is not ok 30575 1726867641.18505: done checking to see if all hosts have failed 30575 1726867641.18506: getting the remaining hosts for this loop 30575 1726867641.18508: done getting the remaining hosts for this loop 30575 1726867641.18511: getting the next task for host managed_node3 30575 1726867641.18520: done getting next task for host managed_node3 30575 1726867641.18525: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867641.18532: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867641.18553: getting variables 30575 1726867641.18555: in VariableManager get_vars() 30575 1726867641.18600: Calling all_inventory to load vars for managed_node3 30575 1726867641.18603: Calling groups_inventory to load vars for managed_node3 30575 1726867641.18606: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867641.18619: Calling all_plugins_play to load vars for managed_node3 30575 1726867641.18623: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867641.18626: Calling groups_plugins_play to load vars for managed_node3 30575 1726867641.19992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867641.21542: done with get_vars() 30575 1726867641.21561: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:27:21 -0400 (0:00:00.049) 0:01:16.593 ****** 30575 1726867641.21641: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867641.21892: worker is 1 (out of 1 available) 30575 1726867641.21905: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867641.21917: done queuing things up, now waiting for results queue to drain 30575 1726867641.21919: waiting for pending results... 
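The records that follow show Ansible's standard remote-execution handshake for the `network_connections` module: `_low_level_execute_command()` resolves the remote home (`echo ~`), creates a private `ansible-tmp-*` working directory under `umask 77`, sftp-transfers the `AnsiballZ_*.py` payload, marks it executable, and runs it with the remote Python. A minimal local re-creation of those shell steps (assumption: a local temp directory stands in for the remote `/root`, and the fixed suffix `12345` replaces Ansible's random component):

```python
# Local sketch of the three shell steps the log shows Ansible sending over SSH.
import os
import subprocess
import tempfile
import time

base = tempfile.mkdtemp()  # stand-in for the remote home directory
name = "ansible-tmp-%s-%d-%d" % (time.time(), os.getpid(), 12345)
tmpdir = os.path.join(base, ".ansible", "tmp", name)

# 1) resolve the remote home:   /bin/sh -c 'echo ~ && sleep 0'
home = subprocess.run(["/bin/sh", "-c", "echo ~ && sleep 0"],
                      capture_output=True, text=True).stdout.strip()

# 2) create a private working dir, exactly as in the log:
#    ( umask 77 && mkdir -p "<...>/.ansible/tmp" && mkdir "<tmpdir>" )
subprocess.run(["/bin/sh", "-c",
                '( umask 77 && mkdir -p "%s" && mkdir "%s" ) && sleep 0'
                % (os.path.dirname(tmpdir), tmpdir)], check=True)

# 3) after the sftp put of AnsiballZ_network_connections.py:
#    chmod u+x <tmpdir>/ <tmpdir>/AnsiballZ_*.py, then execute with python3.12
subprocess.run(["/bin/sh", "-c", "chmod u+x '%s' && sleep 0" % tmpdir],
               check=True)
```

The `umask 77` is why the working directory ends up private to the connecting user (mode 700) before the module payload lands in it.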
30575 1726867641.22295: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867641.22341: in run() - task 0affcac9-a3a5-e081-a588-00000000184a 30575 1726867641.22360: variable 'ansible_search_path' from source: unknown 30575 1726867641.22369: variable 'ansible_search_path' from source: unknown 30575 1726867641.22417: calling self._execute() 30575 1726867641.22516: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867641.22528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867641.22543: variable 'omit' from source: magic vars 30575 1726867641.22904: variable 'ansible_distribution_major_version' from source: facts 30575 1726867641.22920: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867641.22934: variable 'omit' from source: magic vars 30575 1726867641.23003: variable 'omit' from source: magic vars 30575 1726867641.23164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867641.25628: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867641.25701: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867641.25742: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867641.25870: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867641.25873: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867641.25896: variable 'network_provider' from source: set_fact 30575 1726867641.26030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867641.26065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867641.26098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867641.26141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867641.26161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867641.26240: variable 'omit' from source: magic vars 30575 1726867641.26351: variable 'omit' from source: magic vars 30575 1726867641.26458: variable 'network_connections' from source: include params 30575 1726867641.26474: variable 'interface' from source: play vars 30575 1726867641.26544: variable 'interface' from source: play vars 30575 1726867641.26700: variable 'omit' from source: magic vars 30575 1726867641.26712: variable '__lsr_ansible_managed' from source: task vars 30575 1726867641.26842: variable '__lsr_ansible_managed' from source: task vars 30575 1726867641.26935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30575 1726867641.27242: Loaded config def from plugin (lookup/template) 30575 1726867641.27245: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30575 1726867641.27284: File lookup term: get_ansible_managed.j2 30575 1726867641.27287: variable 
'ansible_search_path' from source: unknown 30575 1726867641.27291: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30575 1726867641.27307: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30575 1726867641.27324: variable 'ansible_search_path' from source: unknown 30575 1726867641.31900: variable 'ansible_managed' from source: unknown 30575 1726867641.32052: variable 'omit' from source: magic vars 30575 1726867641.32056: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867641.32059: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867641.32091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867641.32095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30575 1726867641.32104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867641.32143: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867641.32147: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867641.32151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867641.32233: Set connection var ansible_pipelining to False 30575 1726867641.32236: Set connection var ansible_shell_type to sh 30575 1726867641.32241: Set connection var ansible_shell_executable to /bin/sh 30575 1726867641.32246: Set connection var ansible_timeout to 10 30575 1726867641.32251: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867641.32257: Set connection var ansible_connection to ssh 30575 1726867641.32278: variable 'ansible_shell_executable' from source: unknown 30575 1726867641.32281: variable 'ansible_connection' from source: unknown 30575 1726867641.32284: variable 'ansible_module_compression' from source: unknown 30575 1726867641.32286: variable 'ansible_shell_type' from source: unknown 30575 1726867641.32288: variable 'ansible_shell_executable' from source: unknown 30575 1726867641.32291: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867641.32293: variable 'ansible_pipelining' from source: unknown 30575 1726867641.32295: variable 'ansible_timeout' from source: unknown 30575 1726867641.32300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867641.32393: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867641.32404: variable 'omit' from 
source: magic vars 30575 1726867641.32407: starting attempt loop 30575 1726867641.32414: running the handler 30575 1726867641.32423: _low_level_execute_command(): starting 30575 1726867641.32429: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867641.32875: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867641.32906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867641.32909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867641.32911: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867641.32915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867641.32969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867641.32976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867641.32980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867641.33029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867641.34722: stdout chunk (state=3): >>>/root <<< 30575 1726867641.34885: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 30575 1726867641.34889: stdout chunk (state=3): >>><<< 30575 1726867641.34891: stderr chunk (state=3): >>><<< 30575 1726867641.34912: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867641.34983: _low_level_execute_command(): starting 30575 1726867641.34989: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867641.3491817-34165-270136693116572 `" && echo ansible-tmp-1726867641.3491817-34165-270136693116572="` echo /root/.ansible/tmp/ansible-tmp-1726867641.3491817-34165-270136693116572 `" ) && sleep 0' 30575 1726867641.35573: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867641.35590: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867641.35606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867641.35647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867641.35661: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867641.35759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867641.35784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867641.35865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867641.37755: stdout chunk (state=3): >>>ansible-tmp-1726867641.3491817-34165-270136693116572=/root/.ansible/tmp/ansible-tmp-1726867641.3491817-34165-270136693116572 <<< 30575 1726867641.37902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867641.37934: stdout chunk (state=3): >>><<< 30575 1726867641.37938: stderr chunk (state=3): >>><<< 30575 1726867641.37984: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726867641.3491817-34165-270136693116572=/root/.ansible/tmp/ansible-tmp-1726867641.3491817-34165-270136693116572 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867641.38012: variable 'ansible_module_compression' from source: unknown 30575 1726867641.38073: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30575 1726867641.38112: variable 'ansible_facts' from source: unknown 30575 1726867641.38290: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867641.3491817-34165-270136693116572/AnsiballZ_network_connections.py 30575 1726867641.38414: Sending initial data 30575 1726867641.38424: Sent initial data (168 bytes) 30575 1726867641.39058: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867641.39074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867641.39176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867641.39208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867641.39224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867641.39245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867641.39330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867641.40880: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 
debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867641.40947: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867641.40986: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpo6r5lv9e /root/.ansible/tmp/ansible-tmp-1726867641.3491817-34165-270136693116572/AnsiballZ_network_connections.py <<< 30575 1726867641.41012: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867641.3491817-34165-270136693116572/AnsiballZ_network_connections.py" <<< 30575 1726867641.41047: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpo6r5lv9e" to remote "/root/.ansible/tmp/ansible-tmp-1726867641.3491817-34165-270136693116572/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867641.3491817-34165-270136693116572/AnsiballZ_network_connections.py" <<< 30575 1726867641.42128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867641.42162: stderr chunk (state=3): >>><<< 30575 1726867641.42281: stdout chunk (state=3): >>><<< 30575 1726867641.42284: done transferring module to remote 30575 1726867641.42287: _low_level_execute_command(): starting 30575 1726867641.42289: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867641.3491817-34165-270136693116572/ /root/.ansible/tmp/ansible-tmp-1726867641.3491817-34165-270136693116572/AnsiballZ_network_connections.py && sleep 0' 30575 1726867641.42852: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867641.42891: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867641.42904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867641.43001: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867641.43013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867641.43043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867641.43120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867641.44909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867641.44912: stdout chunk (state=3): >>><<< 30575 1726867641.44914: stderr chunk (state=3): >>><<< 30575 1726867641.44938: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867641.44946: _low_level_execute_command(): starting 30575 1726867641.45021: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867641.3491817-34165-270136693116572/AnsiballZ_network_connections.py && sleep 0' 30575 1726867641.45562: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867641.45576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867641.45594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867641.45611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867641.45716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867641.45745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867641.45760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867641.45852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867641.73642: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 907d8824-891a-4719-b02a-cbadb34e89d9\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30575 1726867641.76253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867641.76260: stdout chunk (state=3): >>><<< 30575 1726867641.76262: stderr chunk (state=3): >>><<< 30575 1726867641.76322: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 907d8824-891a-4719-b02a-cbadb34e89d9\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867641.76328: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867641.3491817-34165-270136693116572/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867641.76356: _low_level_execute_command(): starting 30575 1726867641.76363: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867641.3491817-34165-270136693116572/ > /dev/null 2>&1 && sleep 0' 30575 1726867641.77190: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867641.77194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867641.77197: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867641.77199: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867641.77202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867641.77267: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867641.77317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867641.77436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867641.79205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867641.79234: stderr chunk (state=3): >>><<< 30575 1726867641.79237: stdout chunk (state=3): >>><<< 30575 1726867641.79251: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867641.79258: handler run complete 30575 1726867641.79284: attempt loop complete, returning result 30575 1726867641.79287: _execute() done 30575 1726867641.79290: dumping result to json 30575 1726867641.79298: done dumping result, returning 30575 1726867641.79314: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-e081-a588-00000000184a] 30575 1726867641.79317: sending task result for task 0affcac9-a3a5-e081-a588-00000000184a 30575 1726867641.79440: done sending task result for task 0affcac9-a3a5-e081-a588-00000000184a 30575 1726867641.79443: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 907d8824-891a-4719-b02a-cbadb34e89d9 30575 1726867641.79583: no more pending results, returning what we have 30575 1726867641.79586: results queue empty 30575 1726867641.79587: checking for any_errors_fatal 30575 1726867641.79591: done checking for any_errors_fatal 30575 1726867641.79592: checking for max_fail_percentage 30575 1726867641.79593: done 
checking for max_fail_percentage 30575 1726867641.79594: checking to see if all hosts have failed and the running result is not ok 30575 1726867641.79595: done checking to see if all hosts have failed 30575 1726867641.79596: getting the remaining hosts for this loop 30575 1726867641.79597: done getting the remaining hosts for this loop 30575 1726867641.79600: getting the next task for host managed_node3 30575 1726867641.79606: done getting next task for host managed_node3 30575 1726867641.79612: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867641.79616: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867641.79628: getting variables 30575 1726867641.79630: in VariableManager get_vars() 30575 1726867641.79668: Calling all_inventory to load vars for managed_node3 30575 1726867641.79671: Calling groups_inventory to load vars for managed_node3 30575 1726867641.79673: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867641.79783: Calling all_plugins_play to load vars for managed_node3 30575 1726867641.79787: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867641.79790: Calling groups_plugins_play to load vars for managed_node3 30575 1726867641.81547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867641.82863: done with get_vars() 30575 1726867641.82895: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:27:21 -0400 (0:00:00.613) 0:01:17.207 ****** 30575 1726867641.82973: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867641.83311: worker is 1 (out of 1 available) 30575 1726867641.83326: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867641.83339: done queuing things up, now waiting for results queue to drain 30575 1726867641.83341: waiting for pending results... 
30575 1726867641.83555: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867641.83659: in run() - task 0affcac9-a3a5-e081-a588-00000000184b 30575 1726867641.83675: variable 'ansible_search_path' from source: unknown 30575 1726867641.83680: variable 'ansible_search_path' from source: unknown 30575 1726867641.83713: calling self._execute() 30575 1726867641.83785: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867641.83788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867641.83800: variable 'omit' from source: magic vars 30575 1726867641.84105: variable 'ansible_distribution_major_version' from source: facts 30575 1726867641.84114: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867641.84203: variable 'network_state' from source: role '' defaults 30575 1726867641.84214: Evaluated conditional (network_state != {}): False 30575 1726867641.84219: when evaluation is False, skipping this task 30575 1726867641.84221: _execute() done 30575 1726867641.84224: dumping result to json 30575 1726867641.84229: done dumping result, returning 30575 1726867641.84232: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-e081-a588-00000000184b] 30575 1726867641.84243: sending task result for task 0affcac9-a3a5-e081-a588-00000000184b 30575 1726867641.84329: done sending task result for task 0affcac9-a3a5-e081-a588-00000000184b 30575 1726867641.84331: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867641.84395: no more pending results, returning what we have 30575 1726867641.84399: results queue empty 30575 1726867641.84400: checking for any_errors_fatal 30575 1726867641.84418: done checking for any_errors_fatal 
30575 1726867641.84419: checking for max_fail_percentage 30575 1726867641.84421: done checking for max_fail_percentage 30575 1726867641.84422: checking to see if all hosts have failed and the running result is not ok 30575 1726867641.84422: done checking to see if all hosts have failed 30575 1726867641.84423: getting the remaining hosts for this loop 30575 1726867641.84424: done getting the remaining hosts for this loop 30575 1726867641.84428: getting the next task for host managed_node3 30575 1726867641.84435: done getting next task for host managed_node3 30575 1726867641.84438: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867641.84444: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867641.84463: getting variables 30575 1726867641.84465: in VariableManager get_vars() 30575 1726867641.84524: Calling all_inventory to load vars for managed_node3 30575 1726867641.84527: Calling groups_inventory to load vars for managed_node3 30575 1726867641.84529: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867641.84539: Calling all_plugins_play to load vars for managed_node3 30575 1726867641.84541: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867641.84546: Calling groups_plugins_play to load vars for managed_node3 30575 1726867641.85484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867641.86505: done with get_vars() 30575 1726867641.86530: done getting variables 30575 1726867641.86593: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:27:21 -0400 (0:00:00.036) 0:01:17.243 ****** 30575 1726867641.86636: entering _queue_task() for managed_node3/debug 30575 1726867641.86864: worker is 1 (out of 1 available) 30575 1726867641.86880: exiting _queue_task() for managed_node3/debug 30575 1726867641.86892: done queuing things up, now waiting for results queue to drain 30575 1726867641.86894: waiting for pending results... 
30575 1726867641.87103: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867641.87259: in run() - task 0affcac9-a3a5-e081-a588-00000000184c 30575 1726867641.87263: variable 'ansible_search_path' from source: unknown 30575 1726867641.87266: variable 'ansible_search_path' from source: unknown 30575 1726867641.87286: calling self._execute() 30575 1726867641.87371: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867641.87390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867641.87439: variable 'omit' from source: magic vars 30575 1726867641.87790: variable 'ansible_distribution_major_version' from source: facts 30575 1726867641.87795: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867641.87802: variable 'omit' from source: magic vars 30575 1726867641.87859: variable 'omit' from source: magic vars 30575 1726867641.87888: variable 'omit' from source: magic vars 30575 1726867641.87963: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867641.87976: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867641.88023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867641.88026: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867641.88031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867641.88081: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867641.88086: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867641.88089: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30575 1726867641.88235: Set connection var ansible_pipelining to False 30575 1726867641.88238: Set connection var ansible_shell_type to sh 30575 1726867641.88241: Set connection var ansible_shell_executable to /bin/sh 30575 1726867641.88243: Set connection var ansible_timeout to 10 30575 1726867641.88245: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867641.88247: Set connection var ansible_connection to ssh 30575 1726867641.88250: variable 'ansible_shell_executable' from source: unknown 30575 1726867641.88252: variable 'ansible_connection' from source: unknown 30575 1726867641.88254: variable 'ansible_module_compression' from source: unknown 30575 1726867641.88266: variable 'ansible_shell_type' from source: unknown 30575 1726867641.88270: variable 'ansible_shell_executable' from source: unknown 30575 1726867641.88272: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867641.88279: variable 'ansible_pipelining' from source: unknown 30575 1726867641.88281: variable 'ansible_timeout' from source: unknown 30575 1726867641.88283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867641.88427: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867641.88431: variable 'omit' from source: magic vars 30575 1726867641.88434: starting attempt loop 30575 1726867641.88436: running the handler 30575 1726867641.88563: variable '__network_connections_result' from source: set_fact 30575 1726867641.88686: handler run complete 30575 1726867641.88690: attempt loop complete, returning result 30575 1726867641.88692: _execute() done 30575 1726867641.88695: dumping result to json 30575 1726867641.88697: 
done dumping result, returning 30575 1726867641.88699: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-e081-a588-00000000184c] 30575 1726867641.88701: sending task result for task 0affcac9-a3a5-e081-a588-00000000184c 30575 1726867641.88787: done sending task result for task 0affcac9-a3a5-e081-a588-00000000184c 30575 1726867641.88793: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 907d8824-891a-4719-b02a-cbadb34e89d9" ] } 30575 1726867641.88862: no more pending results, returning what we have 30575 1726867641.88866: results queue empty 30575 1726867641.88866: checking for any_errors_fatal 30575 1726867641.88873: done checking for any_errors_fatal 30575 1726867641.88874: checking for max_fail_percentage 30575 1726867641.88875: done checking for max_fail_percentage 30575 1726867641.88876: checking to see if all hosts have failed and the running result is not ok 30575 1726867641.88879: done checking to see if all hosts have failed 30575 1726867641.88879: getting the remaining hosts for this loop 30575 1726867641.88881: done getting the remaining hosts for this loop 30575 1726867641.88884: getting the next task for host managed_node3 30575 1726867641.88892: done getting next task for host managed_node3 30575 1726867641.88898: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867641.88903: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867641.88915: getting variables 30575 1726867641.88917: in VariableManager get_vars() 30575 1726867641.88955: Calling all_inventory to load vars for managed_node3 30575 1726867641.88958: Calling groups_inventory to load vars for managed_node3 30575 1726867641.88960: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867641.88971: Calling all_plugins_play to load vars for managed_node3 30575 1726867641.88974: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867641.88976: Calling groups_plugins_play to load vars for managed_node3 30575 1726867641.90235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867641.91520: done with get_vars() 30575 1726867641.91535: done getting variables 30575 1726867641.91576: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:27:21 -0400 (0:00:00.049) 0:01:17.293 ****** 30575 1726867641.91605: entering _queue_task() for managed_node3/debug 30575 1726867641.91839: worker is 1 (out of 1 available) 30575 1726867641.91853: exiting _queue_task() for managed_node3/debug 30575 1726867641.91867: done queuing things up, now waiting for results queue to drain 30575 1726867641.91869: waiting for pending results... 30575 1726867641.92066: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867641.92168: in run() - task 0affcac9-a3a5-e081-a588-00000000184d 30575 1726867641.92180: variable 'ansible_search_path' from source: unknown 30575 1726867641.92184: variable 'ansible_search_path' from source: unknown 30575 1726867641.92217: calling self._execute() 30575 1726867641.92288: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867641.92291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867641.92301: variable 'omit' from source: magic vars 30575 1726867641.92681: variable 'ansible_distribution_major_version' from source: facts 30575 1726867641.92712: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867641.92715: variable 'omit' from source: magic vars 30575 1726867641.92760: variable 'omit' from source: magic vars 30575 1726867641.92940: variable 'omit' from source: magic vars 30575 1726867641.92943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867641.92946: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867641.92949: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867641.92952: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867641.92955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867641.93000: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867641.93003: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867641.93005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867641.93168: Set connection var ansible_pipelining to False 30575 1726867641.93172: Set connection var ansible_shell_type to sh 30575 1726867641.93174: Set connection var ansible_shell_executable to /bin/sh 30575 1726867641.93178: Set connection var ansible_timeout to 10 30575 1726867641.93181: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867641.93183: Set connection var ansible_connection to ssh 30575 1726867641.93185: variable 'ansible_shell_executable' from source: unknown 30575 1726867641.93187: variable 'ansible_connection' from source: unknown 30575 1726867641.93189: variable 'ansible_module_compression' from source: unknown 30575 1726867641.93191: variable 'ansible_shell_type' from source: unknown 30575 1726867641.93193: variable 'ansible_shell_executable' from source: unknown 30575 1726867641.93195: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867641.93197: variable 'ansible_pipelining' from source: unknown 30575 1726867641.93199: variable 'ansible_timeout' from source: unknown 30575 1726867641.93201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867641.93310: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867641.93318: variable 'omit' from source: magic vars 30575 1726867641.93323: starting attempt loop 30575 1726867641.93326: running the handler 30575 1726867641.93375: variable '__network_connections_result' from source: set_fact 30575 1726867641.93448: variable '__network_connections_result' from source: set_fact 30575 1726867641.93542: handler run complete 30575 1726867641.93559: attempt loop complete, returning result 30575 1726867641.93562: _execute() done 30575 1726867641.93564: dumping result to json 30575 1726867641.93567: done dumping result, returning 30575 1726867641.93579: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-e081-a588-00000000184d] 30575 1726867641.93582: sending task result for task 0affcac9-a3a5-e081-a588-00000000184d 30575 1726867641.93709: done sending task result for task 0affcac9-a3a5-e081-a588-00000000184d 30575 1726867641.93712: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 907d8824-891a-4719-b02a-cbadb34e89d9\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 907d8824-891a-4719-b02a-cbadb34e89d9" ] } } 30575 1726867641.93812: no more pending results, returning what we have 30575 1726867641.93815: results queue 
empty 30575 1726867641.93816: checking for any_errors_fatal 30575 1726867641.93822: done checking for any_errors_fatal 30575 1726867641.93823: checking for max_fail_percentage 30575 1726867641.93824: done checking for max_fail_percentage 30575 1726867641.93825: checking to see if all hosts have failed and the running result is not ok 30575 1726867641.93825: done checking to see if all hosts have failed 30575 1726867641.93826: getting the remaining hosts for this loop 30575 1726867641.93827: done getting the remaining hosts for this loop 30575 1726867641.93834: getting the next task for host managed_node3 30575 1726867641.93841: done getting next task for host managed_node3 30575 1726867641.93845: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867641.93849: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867641.93860: getting variables 30575 1726867641.93861: in VariableManager get_vars() 30575 1726867641.93931: Calling all_inventory to load vars for managed_node3 30575 1726867641.93933: Calling groups_inventory to load vars for managed_node3 30575 1726867641.93935: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867641.93941: Calling all_plugins_play to load vars for managed_node3 30575 1726867641.93942: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867641.93944: Calling groups_plugins_play to load vars for managed_node3 30575 1726867641.94887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867641.96150: done with get_vars() 30575 1726867641.96165: done getting variables 30575 1726867641.96206: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:27:21 -0400 (0:00:00.046) 0:01:17.339 ****** 30575 1726867641.96234: entering _queue_task() for managed_node3/debug 30575 1726867641.96517: worker is 1 (out of 1 available) 30575 1726867641.96531: exiting _queue_task() for managed_node3/debug 30575 1726867641.96544: done queuing things up, now waiting for results queue to drain 30575 1726867641.96546: waiting for pending results... 
30575 1726867641.96740: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867641.96839: in run() - task 0affcac9-a3a5-e081-a588-00000000184e 30575 1726867641.96849: variable 'ansible_search_path' from source: unknown 30575 1726867641.96853: variable 'ansible_search_path' from source: unknown 30575 1726867641.96888: calling self._execute() 30575 1726867641.96957: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867641.96961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867641.96972: variable 'omit' from source: magic vars 30575 1726867641.97264: variable 'ansible_distribution_major_version' from source: facts 30575 1726867641.97273: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867641.97360: variable 'network_state' from source: role '' defaults 30575 1726867641.97368: Evaluated conditional (network_state != {}): False 30575 1726867641.97373: when evaluation is False, skipping this task 30575 1726867641.97376: _execute() done 30575 1726867641.97381: dumping result to json 30575 1726867641.97386: done dumping result, returning 30575 1726867641.97395: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-e081-a588-00000000184e] 30575 1726867641.97400: sending task result for task 0affcac9-a3a5-e081-a588-00000000184e 30575 1726867641.97490: done sending task result for task 0affcac9-a3a5-e081-a588-00000000184e 30575 1726867641.97495: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30575 1726867641.97547: no more pending results, returning what we have 30575 1726867641.97550: results queue empty 30575 1726867641.97551: checking for any_errors_fatal 30575 1726867641.97560: done checking for any_errors_fatal 30575 1726867641.97561: checking for 
max_fail_percentage 30575 1726867641.97562: done checking for max_fail_percentage 30575 1726867641.97563: checking to see if all hosts have failed and the running result is not ok 30575 1726867641.97564: done checking to see if all hosts have failed 30575 1726867641.97564: getting the remaining hosts for this loop 30575 1726867641.97566: done getting the remaining hosts for this loop 30575 1726867641.97570: getting the next task for host managed_node3 30575 1726867641.97579: done getting next task for host managed_node3 30575 1726867641.97583: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867641.97587: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867641.97606: getting variables 30575 1726867641.97607: in VariableManager get_vars() 30575 1726867641.97648: Calling all_inventory to load vars for managed_node3 30575 1726867641.97650: Calling groups_inventory to load vars for managed_node3 30575 1726867641.97652: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867641.97661: Calling all_plugins_play to load vars for managed_node3 30575 1726867641.97663: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867641.97665: Calling groups_plugins_play to load vars for managed_node3 30575 1726867641.98604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867641.99717: done with get_vars() 30575 1726867641.99731: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:27:21 -0400 (0:00:00.035) 0:01:17.375 ****** 30575 1726867641.99799: entering _queue_task() for managed_node3/ping 30575 1726867642.00029: worker is 1 (out of 1 available) 30575 1726867642.00043: exiting _queue_task() for managed_node3/ping 30575 1726867642.00056: done queuing things up, now waiting for results queue to drain 30575 1726867642.00058: waiting for pending results... 
30575 1726867642.00439: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867642.00461: in run() - task 0affcac9-a3a5-e081-a588-00000000184f 30575 1726867642.00466: variable 'ansible_search_path' from source: unknown 30575 1726867642.00469: variable 'ansible_search_path' from source: unknown 30575 1726867642.00519: calling self._execute() 30575 1726867642.00605: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867642.00611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867642.00614: variable 'omit' from source: magic vars 30575 1726867642.01008: variable 'ansible_distribution_major_version' from source: facts 30575 1726867642.01028: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867642.01032: variable 'omit' from source: magic vars 30575 1726867642.01131: variable 'omit' from source: magic vars 30575 1726867642.01134: variable 'omit' from source: magic vars 30575 1726867642.01171: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867642.01225: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867642.01229: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867642.01243: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867642.01254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867642.01281: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867642.01284: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867642.01287: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867642.01360: Set connection var ansible_pipelining to False 30575 1726867642.01363: Set connection var ansible_shell_type to sh 30575 1726867642.01368: Set connection var ansible_shell_executable to /bin/sh 30575 1726867642.01373: Set connection var ansible_timeout to 10 30575 1726867642.01382: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867642.01389: Set connection var ansible_connection to ssh 30575 1726867642.01423: variable 'ansible_shell_executable' from source: unknown 30575 1726867642.01426: variable 'ansible_connection' from source: unknown 30575 1726867642.01429: variable 'ansible_module_compression' from source: unknown 30575 1726867642.01431: variable 'ansible_shell_type' from source: unknown 30575 1726867642.01433: variable 'ansible_shell_executable' from source: unknown 30575 1726867642.01435: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867642.01438: variable 'ansible_pipelining' from source: unknown 30575 1726867642.01440: variable 'ansible_timeout' from source: unknown 30575 1726867642.01442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867642.01625: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867642.01630: variable 'omit' from source: magic vars 30575 1726867642.01632: starting attempt loop 30575 1726867642.01639: running the handler 30575 1726867642.01641: _low_level_execute_command(): starting 30575 1726867642.01649: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867642.02308: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 
1726867642.02312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867642.02316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867642.02360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867642.02367: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867642.02381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867642.02445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867642.04138: stdout chunk (state=3): >>>/root <<< 30575 1726867642.04236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867642.04262: stderr chunk (state=3): >>><<< 30575 1726867642.04266: stdout chunk (state=3): >>><<< 30575 1726867642.04286: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867642.04297: _low_level_execute_command(): starting 30575 1726867642.04302: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867642.0428603-34196-75048200328950 `" && echo ansible-tmp-1726867642.0428603-34196-75048200328950="` echo /root/.ansible/tmp/ansible-tmp-1726867642.0428603-34196-75048200328950 `" ) && sleep 0' 30575 1726867642.04718: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867642.04721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867642.04723: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 30575 1726867642.04732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867642.04773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867642.04776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867642.04827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867642.06730: stdout chunk (state=3): >>>ansible-tmp-1726867642.0428603-34196-75048200328950=/root/.ansible/tmp/ansible-tmp-1726867642.0428603-34196-75048200328950 <<< 30575 1726867642.06836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867642.06859: stderr chunk (state=3): >>><<< 30575 1726867642.06863: stdout chunk (state=3): >>><<< 30575 1726867642.06879: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867642.0428603-34196-75048200328950=/root/.ansible/tmp/ansible-tmp-1726867642.0428603-34196-75048200328950 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867642.06920: variable 'ansible_module_compression' from source: unknown 30575 1726867642.06951: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30575 1726867642.06979: variable 'ansible_facts' from source: unknown 30575 1726867642.07040: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867642.0428603-34196-75048200328950/AnsiballZ_ping.py 30575 1726867642.07136: Sending initial data 30575 1726867642.07140: Sent initial data (152 bytes) 30575 1726867642.07547: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867642.07588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867642.07591: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867642.07593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867642.07595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867642.07597: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867642.07599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867642.07640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867642.07643: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867642.07697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867642.09237: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30575 1726867642.09241: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867642.09279: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867642.09327: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp_ijmlcrl /root/.ansible/tmp/ansible-tmp-1726867642.0428603-34196-75048200328950/AnsiballZ_ping.py <<< 30575 1726867642.09330: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867642.0428603-34196-75048200328950/AnsiballZ_ping.py" <<< 30575 1726867642.09368: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp_ijmlcrl" to remote "/root/.ansible/tmp/ansible-tmp-1726867642.0428603-34196-75048200328950/AnsiballZ_ping.py" <<< 30575 1726867642.09371: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867642.0428603-34196-75048200328950/AnsiballZ_ping.py" <<< 30575 1726867642.09890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867642.09926: stderr chunk (state=3): >>><<< 30575 1726867642.09929: stdout chunk (state=3): >>><<< 30575 1726867642.09971: done transferring module to remote 30575 1726867642.09980: _low_level_execute_command(): starting 30575 1726867642.09986: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867642.0428603-34196-75048200328950/ /root/.ansible/tmp/ansible-tmp-1726867642.0428603-34196-75048200328950/AnsiballZ_ping.py && sleep 0' 30575 1726867642.10445: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867642.10448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867642.10451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867642.10453: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867642.10456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867642.10504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867642.10507: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867642.10558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867642.12347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867642.12350: stdout chunk (state=3): >>><<< 30575 1726867642.12352: stderr chunk (state=3): >>><<< 30575 1726867642.12365: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867642.12441: _low_level_execute_command(): starting 30575 1726867642.12444: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867642.0428603-34196-75048200328950/AnsiballZ_ping.py && sleep 0' 30575 1726867642.12939: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867642.12946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867642.12957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867642.12971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867642.12984: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867642.12992: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867642.13001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867642.13019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867642.13026: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867642.13034: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867642.13041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867642.13051: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867642.13064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867642.13072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867642.13084: stderr chunk (state=3): >>>debug2: match found <<< 30575 1726867642.13094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867642.13160: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867642.13180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867642.13196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867642.13271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867642.28155: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30575 1726867642.29374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867642.29396: stderr chunk (state=3): >>><<< 30575 1726867642.29400: stdout chunk (state=3): >>><<< 30575 1726867642.29416: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867642.29527: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867642.0428603-34196-75048200328950/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867642.29533: _low_level_execute_command(): starting 30575 1726867642.29536: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867642.0428603-34196-75048200328950/ > /dev/null 2>&1 && sleep 0' 30575 1726867642.30119: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867642.30163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867642.31976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867642.32002: stderr chunk (state=3): >>><<< 30575 1726867642.32006: stdout chunk (state=3): >>><<< 30575 1726867642.32021: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867642.32026: handler run complete 30575 1726867642.32038: attempt loop complete, returning result 30575 1726867642.32041: _execute() done 30575 1726867642.32044: dumping result to json 30575 1726867642.32046: done dumping result, returning 30575 1726867642.32055: done running TaskExecutor() for 
managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-e081-a588-00000000184f] 30575 1726867642.32060: sending task result for task 0affcac9-a3a5-e081-a588-00000000184f 30575 1726867642.32142: done sending task result for task 0affcac9-a3a5-e081-a588-00000000184f 30575 1726867642.32145: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30575 1726867642.32210: no more pending results, returning what we have 30575 1726867642.32213: results queue empty 30575 1726867642.32214: checking for any_errors_fatal 30575 1726867642.32222: done checking for any_errors_fatal 30575 1726867642.32223: checking for max_fail_percentage 30575 1726867642.32224: done checking for max_fail_percentage 30575 1726867642.32225: checking to see if all hosts have failed and the running result is not ok 30575 1726867642.32226: done checking to see if all hosts have failed 30575 1726867642.32227: getting the remaining hosts for this loop 30575 1726867642.32228: done getting the remaining hosts for this loop 30575 1726867642.32234: getting the next task for host managed_node3 30575 1726867642.32246: done getting next task for host managed_node3 30575 1726867642.32248: ^ task is: TASK: meta (role_complete) 30575 1726867642.32254: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867642.32267: getting variables 30575 1726867642.32269: in VariableManager get_vars() 30575 1726867642.32316: Calling all_inventory to load vars for managed_node3 30575 1726867642.32318: Calling groups_inventory to load vars for managed_node3 30575 1726867642.32321: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867642.32330: Calling all_plugins_play to load vars for managed_node3 30575 1726867642.32332: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867642.32335: Calling groups_plugins_play to load vars for managed_node3 30575 1726867642.33799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867642.34660: done with get_vars() 30575 1726867642.34676: done getting variables 30575 1726867642.34738: done queuing things up, now waiting for results queue to drain 30575 1726867642.34739: results queue empty 30575 1726867642.34740: checking for any_errors_fatal 30575 1726867642.34741: done checking for any_errors_fatal 30575 1726867642.34742: checking for max_fail_percentage 30575 1726867642.34742: done checking for max_fail_percentage 30575 1726867642.34743: checking to see if all hosts have failed and the running result is not ok 30575 1726867642.34743: done checking to see if all hosts have failed 30575 1726867642.34744: getting the remaining hosts for this 
loop 30575 1726867642.34744: done getting the remaining hosts for this loop 30575 1726867642.34746: getting the next task for host managed_node3 30575 1726867642.34749: done getting next task for host managed_node3 30575 1726867642.34750: ^ task is: TASK: Show result 30575 1726867642.34752: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867642.34754: getting variables 30575 1726867642.34754: in VariableManager get_vars() 30575 1726867642.34761: Calling all_inventory to load vars for managed_node3 30575 1726867642.34763: Calling groups_inventory to load vars for managed_node3 30575 1726867642.34764: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867642.34767: Calling all_plugins_play to load vars for managed_node3 30575 1726867642.34769: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867642.34770: Calling groups_plugins_play to load vars for managed_node3 30575 1726867642.35402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867642.36270: done with get_vars() 30575 1726867642.36285: done getting variables 30575 1726867642.36316: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 17:27:22 -0400 (0:00:00.365) 0:01:17.740 ****** 30575 1726867642.36341: entering _queue_task() for managed_node3/debug 30575 1726867642.36583: worker is 1 (out of 1 available) 30575 1726867642.36598: exiting _queue_task() for managed_node3/debug 30575 1726867642.36614: done queuing things up, now waiting for results queue to drain 30575 1726867642.36615: waiting for pending results... 
30575 1726867642.36798: running TaskExecutor() for managed_node3/TASK: Show result 30575 1726867642.36873: in run() - task 0affcac9-a3a5-e081-a588-0000000017d1 30575 1726867642.36885: variable 'ansible_search_path' from source: unknown 30575 1726867642.36889: variable 'ansible_search_path' from source: unknown 30575 1726867642.36919: calling self._execute() 30575 1726867642.36999: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867642.37003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867642.37013: variable 'omit' from source: magic vars 30575 1726867642.37295: variable 'ansible_distribution_major_version' from source: facts 30575 1726867642.37303: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867642.37311: variable 'omit' from source: magic vars 30575 1726867642.37342: variable 'omit' from source: magic vars 30575 1726867642.37365: variable 'omit' from source: magic vars 30575 1726867642.37401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867642.37428: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867642.37444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867642.37456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867642.37467: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867642.37495: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867642.37498: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867642.37501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867642.37571: Set 
connection var ansible_pipelining to False 30575 1726867642.37574: Set connection var ansible_shell_type to sh 30575 1726867642.37579: Set connection var ansible_shell_executable to /bin/sh 30575 1726867642.37591: Set connection var ansible_timeout to 10 30575 1726867642.37597: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867642.37603: Set connection var ansible_connection to ssh 30575 1726867642.37625: variable 'ansible_shell_executable' from source: unknown 30575 1726867642.37628: variable 'ansible_connection' from source: unknown 30575 1726867642.37630: variable 'ansible_module_compression' from source: unknown 30575 1726867642.37633: variable 'ansible_shell_type' from source: unknown 30575 1726867642.37635: variable 'ansible_shell_executable' from source: unknown 30575 1726867642.37637: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867642.37639: variable 'ansible_pipelining' from source: unknown 30575 1726867642.37641: variable 'ansible_timeout' from source: unknown 30575 1726867642.37647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867642.37748: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867642.37824: variable 'omit' from source: magic vars 30575 1726867642.37827: starting attempt loop 30575 1726867642.37832: running the handler 30575 1726867642.37834: variable '__network_connections_result' from source: set_fact 30575 1726867642.37858: variable '__network_connections_result' from source: set_fact 30575 1726867642.37945: handler run complete 30575 1726867642.37963: attempt loop complete, returning result 30575 1726867642.37966: _execute() done 30575 1726867642.37968: dumping result to json 30575 
1726867642.37973: done dumping result, returning 30575 1726867642.38042: done running TaskExecutor() for managed_node3/TASK: Show result [0affcac9-a3a5-e081-a588-0000000017d1] 30575 1726867642.38045: sending task result for task 0affcac9-a3a5-e081-a588-0000000017d1 30575 1726867642.38110: done sending task result for task 0affcac9-a3a5-e081-a588-0000000017d1 30575 1726867642.38113: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 907d8824-891a-4719-b02a-cbadb34e89d9\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 907d8824-891a-4719-b02a-cbadb34e89d9" ] } } 30575 1726867642.38207: no more pending results, returning what we have 30575 1726867642.38212: results queue empty 30575 1726867642.38213: checking for any_errors_fatal 30575 1726867642.38215: done checking for any_errors_fatal 30575 1726867642.38216: checking for max_fail_percentage 30575 1726867642.38217: done checking for max_fail_percentage 30575 1726867642.38218: checking to see if all hosts have failed and the running result is not ok 30575 1726867642.38219: done checking to see if all hosts have failed 30575 1726867642.38220: getting the remaining hosts for this loop 30575 1726867642.38221: done getting the remaining hosts for this loop 30575 1726867642.38230: getting the next task for host managed_node3 30575 1726867642.38239: done getting next task for host managed_node3 30575 1726867642.38242: ^ task is: TASK: Include network role 30575 1726867642.38245: ^ state is: 
HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867642.38249: getting variables 30575 1726867642.38250: in VariableManager get_vars() 30575 1726867642.38274: Calling all_inventory to load vars for managed_node3 30575 1726867642.38275: Calling groups_inventory to load vars for managed_node3 30575 1726867642.38280: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867642.38286: Calling all_plugins_play to load vars for managed_node3 30575 1726867642.38288: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867642.38290: Calling groups_plugins_play to load vars for managed_node3 30575 1726867642.39137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867642.39986: done with get_vars() 30575 1726867642.40000: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 17:27:22 -0400 (0:00:00.037) 0:01:17.778 ****** 30575 1726867642.40062: entering _queue_task() for 
managed_node3/include_role 30575 1726867642.40269: worker is 1 (out of 1 available) 30575 1726867642.40285: exiting _queue_task() for managed_node3/include_role 30575 1726867642.40297: done queuing things up, now waiting for results queue to drain 30575 1726867642.40298: waiting for pending results... 30575 1726867642.40472: running TaskExecutor() for managed_node3/TASK: Include network role 30575 1726867642.40565: in run() - task 0affcac9-a3a5-e081-a588-0000000017d5 30575 1726867642.40698: variable 'ansible_search_path' from source: unknown 30575 1726867642.40702: variable 'ansible_search_path' from source: unknown 30575 1726867642.40705: calling self._execute() 30575 1726867642.40885: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867642.40889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867642.40892: variable 'omit' from source: magic vars 30575 1726867642.41271: variable 'ansible_distribution_major_version' from source: facts 30575 1726867642.41294: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867642.41311: _execute() done 30575 1726867642.41323: dumping result to json 30575 1726867642.41344: done dumping result, returning 30575 1726867642.41358: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcac9-a3a5-e081-a588-0000000017d5] 30575 1726867642.41369: sending task result for task 0affcac9-a3a5-e081-a588-0000000017d5 30575 1726867642.41589: no more pending results, returning what we have 30575 1726867642.41596: in VariableManager get_vars() 30575 1726867642.41652: Calling all_inventory to load vars for managed_node3 30575 1726867642.41656: Calling groups_inventory to load vars for managed_node3 30575 1726867642.41663: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867642.41680: Calling all_plugins_play to load vars for managed_node3 30575 1726867642.41684: Calling groups_plugins_inventory to load vars 
for managed_node3 30575 1726867642.41688: Calling groups_plugins_play to load vars for managed_node3 30575 1726867642.42517: done sending task result for task 0affcac9-a3a5-e081-a588-0000000017d5 30575 1726867642.42520: WORKER PROCESS EXITING 30575 1726867642.45095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867642.46842: done with get_vars() 30575 1726867642.46862: variable 'ansible_search_path' from source: unknown 30575 1726867642.46863: variable 'ansible_search_path' from source: unknown 30575 1726867642.47008: variable 'omit' from source: magic vars 30575 1726867642.47051: variable 'omit' from source: magic vars 30575 1726867642.47065: variable 'omit' from source: magic vars 30575 1726867642.47069: we have included files to process 30575 1726867642.47070: generating all_blocks data 30575 1726867642.47071: done generating all_blocks data 30575 1726867642.47076: processing included file: fedora.linux_system_roles.network 30575 1726867642.47101: in VariableManager get_vars() 30575 1726867642.47116: done with get_vars() 30575 1726867642.47143: in VariableManager get_vars() 30575 1726867642.47160: done with get_vars() 30575 1726867642.47199: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30575 1726867642.47321: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30575 1726867642.47369: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30575 1726867642.47637: in VariableManager get_vars() 30575 1726867642.47651: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867642.48878: iterating over new_blocks loaded from include file 30575 1726867642.48880: in VariableManager get_vars() 30575 1726867642.48892: done with get_vars() 30575 1726867642.48893: 
filtering new block on tags 30575 1726867642.49052: done filtering new block on tags 30575 1726867642.49055: in VariableManager get_vars() 30575 1726867642.49065: done with get_vars() 30575 1726867642.49066: filtering new block on tags 30575 1726867642.49080: done filtering new block on tags 30575 1726867642.49082: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30575 1726867642.49091: extending task lists for all hosts with included blocks 30575 1726867642.49217: done extending task lists 30575 1726867642.49219: done processing included files 30575 1726867642.49219: results queue empty 30575 1726867642.49220: checking for any_errors_fatal 30575 1726867642.49224: done checking for any_errors_fatal 30575 1726867642.49224: checking for max_fail_percentage 30575 1726867642.49225: done checking for max_fail_percentage 30575 1726867642.49226: checking to see if all hosts have failed and the running result is not ok 30575 1726867642.49227: done checking to see if all hosts have failed 30575 1726867642.49228: getting the remaining hosts for this loop 30575 1726867642.49229: done getting the remaining hosts for this loop 30575 1726867642.49231: getting the next task for host managed_node3 30575 1726867642.49236: done getting next task for host managed_node3 30575 1726867642.49238: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867642.49242: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867642.49251: getting variables 30575 1726867642.49252: in VariableManager get_vars() 30575 1726867642.49266: Calling all_inventory to load vars for managed_node3 30575 1726867642.49268: Calling groups_inventory to load vars for managed_node3 30575 1726867642.49270: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867642.49275: Calling all_plugins_play to load vars for managed_node3 30575 1726867642.49279: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867642.49282: Calling groups_plugins_play to load vars for managed_node3 30575 1726867642.50451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867642.52193: done with get_vars() 30575 1726867642.52213: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:27:22 -0400 (0:00:00.122) 0:01:17.900 ****** 30575 1726867642.52285: entering _queue_task() for managed_node3/include_tasks 30575 1726867642.52651: worker is 1 (out of 1 available) 30575 
1726867642.52662: exiting _queue_task() for managed_node3/include_tasks 30575 1726867642.52680: done queuing things up, now waiting for results queue to drain 30575 1726867642.52682: waiting for pending results... 30575 1726867642.52974: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867642.53183: in run() - task 0affcac9-a3a5-e081-a588-0000000019bf 30575 1726867642.53216: variable 'ansible_search_path' from source: unknown 30575 1726867642.53225: variable 'ansible_search_path' from source: unknown 30575 1726867642.53283: calling self._execute() 30575 1726867642.53390: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867642.53402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867642.53532: variable 'omit' from source: magic vars 30575 1726867642.53883: variable 'ansible_distribution_major_version' from source: facts 30575 1726867642.54083: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867642.54087: _execute() done 30575 1726867642.54089: dumping result to json 30575 1726867642.54092: done dumping result, returning 30575 1726867642.54095: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-e081-a588-0000000019bf] 30575 1726867642.54097: sending task result for task 0affcac9-a3a5-e081-a588-0000000019bf 30575 1726867642.54169: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019bf 30575 1726867642.54173: WORKER PROCESS EXITING 30575 1726867642.54223: no more pending results, returning what we have 30575 1726867642.54229: in VariableManager get_vars() 30575 1726867642.54270: Calling all_inventory to load vars for managed_node3 30575 1726867642.54272: Calling groups_inventory to load vars for managed_node3 30575 1726867642.54275: Calling all_plugins_inventory to load vars for managed_node3 
30575 1726867642.54287: Calling all_plugins_play to load vars for managed_node3 30575 1726867642.54290: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867642.54292: Calling groups_plugins_play to load vars for managed_node3 30575 1726867642.55703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867642.57290: done with get_vars() 30575 1726867642.57314: variable 'ansible_search_path' from source: unknown 30575 1726867642.57316: variable 'ansible_search_path' from source: unknown 30575 1726867642.57353: we have included files to process 30575 1726867642.57355: generating all_blocks data 30575 1726867642.57357: done generating all_blocks data 30575 1726867642.57359: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867642.57360: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867642.57363: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867642.57951: done processing included file 30575 1726867642.57953: iterating over new_blocks loaded from include file 30575 1726867642.57955: in VariableManager get_vars() 30575 1726867642.57987: done with get_vars() 30575 1726867642.57989: filtering new block on tags 30575 1726867642.58020: done filtering new block on tags 30575 1726867642.58023: in VariableManager get_vars() 30575 1726867642.58043: done with get_vars() 30575 1726867642.58045: filtering new block on tags 30575 1726867642.58091: done filtering new block on tags 30575 1726867642.58093: in VariableManager get_vars() 30575 1726867642.58112: done with get_vars() 30575 1726867642.58114: filtering new block on tags 30575 1726867642.58152: done filtering new block on tags 30575 1726867642.58153: done iterating over new_blocks 
loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30575 1726867642.58158: extending task lists for all hosts with included blocks 30575 1726867642.59843: done extending task lists 30575 1726867642.59844: done processing included files 30575 1726867642.59845: results queue empty 30575 1726867642.59846: checking for any_errors_fatal 30575 1726867642.59849: done checking for any_errors_fatal 30575 1726867642.59850: checking for max_fail_percentage 30575 1726867642.59851: done checking for max_fail_percentage 30575 1726867642.59851: checking to see if all hosts have failed and the running result is not ok 30575 1726867642.59852: done checking to see if all hosts have failed 30575 1726867642.59853: getting the remaining hosts for this loop 30575 1726867642.59854: done getting the remaining hosts for this loop 30575 1726867642.59857: getting the next task for host managed_node3 30575 1726867642.59862: done getting next task for host managed_node3 30575 1726867642.59864: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867642.59869: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867642.59880: getting variables 30575 1726867642.59882: in VariableManager get_vars() 30575 1726867642.59895: Calling all_inventory to load vars for managed_node3 30575 1726867642.59897: Calling groups_inventory to load vars for managed_node3 30575 1726867642.59899: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867642.59904: Calling all_plugins_play to load vars for managed_node3 30575 1726867642.59906: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867642.59909: Calling groups_plugins_play to load vars for managed_node3 30575 1726867642.61083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867642.62202: done with get_vars() 30575 1726867642.62219: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:27:22 -0400 (0:00:00.099) 0:01:18.000 ****** 30575 1726867642.62269: entering _queue_task() for managed_node3/setup 30575 1726867642.62534: worker is 1 (out of 1 available) 30575 1726867642.62546: exiting _queue_task() for managed_node3/setup 30575 
1726867642.62560: done queuing things up, now waiting for results queue to drain 30575 1726867642.62562: waiting for pending results... 30575 1726867642.62745: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867642.62842: in run() - task 0affcac9-a3a5-e081-a588-000000001a16 30575 1726867642.62854: variable 'ansible_search_path' from source: unknown 30575 1726867642.62857: variable 'ansible_search_path' from source: unknown 30575 1726867642.62889: calling self._execute() 30575 1726867642.63011: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867642.63015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867642.63018: variable 'omit' from source: magic vars 30575 1726867642.63290: variable 'ansible_distribution_major_version' from source: facts 30575 1726867642.63320: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867642.63497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867642.65403: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867642.65406: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867642.65413: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867642.65435: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867642.65455: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867642.65528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 30575 1726867642.65557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867642.65588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867642.65625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867642.65640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867642.65693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867642.65716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867642.65739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867642.65775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867642.65791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867642.65932: variable '__network_required_facts' from source: role '' defaults 30575 1726867642.65940: variable 'ansible_facts' from source: unknown 30575 1726867642.66696: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30575 1726867642.66699: when evaluation is False, skipping this task 30575 1726867642.66702: _execute() done 30575 1726867642.66704: dumping result to json 30575 1726867642.66706: done dumping result, returning 30575 1726867642.66711: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-e081-a588-000000001a16] 30575 1726867642.66713: sending task result for task 0affcac9-a3a5-e081-a588-000000001a16 30575 1726867642.66772: done sending task result for task 0affcac9-a3a5-e081-a588-000000001a16 30575 1726867642.66774: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867642.66822: no more pending results, returning what we have 30575 1726867642.66826: results queue empty 30575 1726867642.66827: checking for any_errors_fatal 30575 1726867642.66828: done checking for any_errors_fatal 30575 1726867642.66828: checking for max_fail_percentage 30575 1726867642.66830: done checking for max_fail_percentage 30575 1726867642.66831: checking to see if all hosts have failed and the running result is not ok 30575 1726867642.66832: done checking to see if all hosts have failed 30575 1726867642.66832: getting the remaining hosts for this loop 30575 1726867642.66834: done getting the remaining hosts for this loop 30575 1726867642.66837: getting the next task for host managed_node3 30575 1726867642.66848: done getting next task for host managed_node3 
30575 1726867642.66852: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867642.66858: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867642.67052: getting variables 30575 1726867642.67054: in VariableManager get_vars() 30575 1726867642.67099: Calling all_inventory to load vars for managed_node3 30575 1726867642.67102: Calling groups_inventory to load vars for managed_node3 30575 1726867642.67104: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867642.67115: Calling all_plugins_play to load vars for managed_node3 30575 1726867642.67117: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867642.67127: Calling groups_plugins_play to load vars for managed_node3 30575 1726867642.68457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867642.70645: done with get_vars() 30575 1726867642.70666: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:27:22 -0400 (0:00:00.086) 0:01:18.086 ****** 30575 1726867642.70918: entering _queue_task() for managed_node3/stat 30575 1726867642.71509: worker is 1 (out of 1 available) 30575 1726867642.71525: exiting _queue_task() for managed_node3/stat 30575 1726867642.71538: done queuing things up, now waiting for results queue to drain 30575 1726867642.71540: waiting for pending results... 
30575 1726867642.71998: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867642.72105: in run() - task 0affcac9-a3a5-e081-a588-000000001a18 30575 1726867642.72138: variable 'ansible_search_path' from source: unknown 30575 1726867642.72143: variable 'ansible_search_path' from source: unknown 30575 1726867642.72182: calling self._execute() 30575 1726867642.72414: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867642.72418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867642.72428: variable 'omit' from source: magic vars 30575 1726867642.72998: variable 'ansible_distribution_major_version' from source: facts 30575 1726867642.73002: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867642.73111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867642.73522: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867642.73638: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867642.73773: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867642.73807: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867642.73999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867642.74063: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867642.74112: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867642.74151: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867642.74260: variable '__network_is_ostree' from source: set_fact 30575 1726867642.74285: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867642.74311: when evaluation is False, skipping this task 30575 1726867642.74315: _execute() done 30575 1726867642.74317: dumping result to json 30575 1726867642.74320: done dumping result, returning 30575 1726867642.74322: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-e081-a588-000000001a18] 30575 1726867642.74384: sending task result for task 0affcac9-a3a5-e081-a588-000000001a18 30575 1726867642.74466: done sending task result for task 0affcac9-a3a5-e081-a588-000000001a18 30575 1726867642.74470: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867642.74522: no more pending results, returning what we have 30575 1726867642.74526: results queue empty 30575 1726867642.74526: checking for any_errors_fatal 30575 1726867642.74536: done checking for any_errors_fatal 30575 1726867642.74536: checking for max_fail_percentage 30575 1726867642.74538: done checking for max_fail_percentage 30575 1726867642.74539: checking to see if all hosts have failed and the running result is not ok 30575 1726867642.74540: done checking to see if all hosts have failed 30575 1726867642.74541: getting the remaining hosts for this loop 30575 1726867642.74542: done getting the remaining hosts for this loop 30575 
1726867642.74546: getting the next task for host managed_node3 30575 1726867642.74554: done getting next task for host managed_node3 30575 1726867642.74557: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867642.74562: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867642.74587: getting variables 30575 1726867642.74589: in VariableManager get_vars() 30575 1726867642.74629: Calling all_inventory to load vars for managed_node3 30575 1726867642.74632: Calling groups_inventory to load vars for managed_node3 30575 1726867642.74634: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867642.74644: Calling all_plugins_play to load vars for managed_node3 30575 1726867642.74647: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867642.74649: Calling groups_plugins_play to load vars for managed_node3 30575 1726867642.83436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867642.85239: done with get_vars() 30575 1726867642.85265: done getting variables 30575 1726867642.85313: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:27:22 -0400 (0:00:00.144) 0:01:18.231 ****** 30575 1726867642.85345: entering _queue_task() for managed_node3/set_fact 30575 1726867642.85707: worker is 1 (out of 1 available) 30575 1726867642.85720: exiting _queue_task() for managed_node3/set_fact 30575 1726867642.85734: done queuing things up, now waiting for results queue to drain 30575 1726867642.85737: waiting for pending results... 
30575 1726867642.86099: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867642.86219: in run() - task 0affcac9-a3a5-e081-a588-000000001a19 30575 1726867642.86228: variable 'ansible_search_path' from source: unknown 30575 1726867642.86234: variable 'ansible_search_path' from source: unknown 30575 1726867642.86270: calling self._execute() 30575 1726867642.86362: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867642.86370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867642.86397: variable 'omit' from source: magic vars 30575 1726867642.86774: variable 'ansible_distribution_major_version' from source: facts 30575 1726867642.86785: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867642.86930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867642.87231: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867642.87280: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867642.87346: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867642.87388: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867642.87472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867642.87525: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867642.87571: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867642.87604: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867642.87704: variable '__network_is_ostree' from source: set_fact 30575 1726867642.87723: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867642.87726: when evaluation is False, skipping this task 30575 1726867642.87729: _execute() done 30575 1726867642.87731: dumping result to json 30575 1726867642.87734: done dumping result, returning 30575 1726867642.87737: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-e081-a588-000000001a19] 30575 1726867642.87740: sending task result for task 0affcac9-a3a5-e081-a588-000000001a19 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867642.87913: no more pending results, returning what we have 30575 1726867642.87919: results queue empty 30575 1726867642.87920: checking for any_errors_fatal 30575 1726867642.87929: done checking for any_errors_fatal 30575 1726867642.87930: checking for max_fail_percentage 30575 1726867642.87934: done checking for max_fail_percentage 30575 1726867642.87935: checking to see if all hosts have failed and the running result is not ok 30575 1726867642.87936: done checking to see if all hosts have failed 30575 1726867642.87937: getting the remaining hosts for this loop 30575 1726867642.87939: done getting the remaining hosts for this loop 30575 1726867642.87945: getting the next task for host managed_node3 30575 1726867642.87961: done getting next task for host managed_node3 30575 
1726867642.87967: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867642.87975: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867642.87994: done sending task result for task 0affcac9-a3a5-e081-a588-000000001a19 30575 1726867642.88000: WORKER PROCESS EXITING 30575 1726867642.88018: getting variables 30575 1726867642.88021: in VariableManager get_vars() 30575 1726867642.88181: Calling all_inventory to load vars for managed_node3 30575 1726867642.88211: Calling groups_inventory to load vars for managed_node3 30575 1726867642.88215: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867642.88291: Calling all_plugins_play to load vars for managed_node3 30575 1726867642.88296: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867642.88300: Calling groups_plugins_play to load vars for managed_node3 30575 1726867642.90598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867642.92626: done with get_vars() 30575 1726867642.92652: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:27:22 -0400 (0:00:00.074) 0:01:18.305 ****** 30575 1726867642.92756: entering _queue_task() for managed_node3/service_facts 30575 1726867642.93221: worker is 1 (out of 1 available) 30575 1726867642.93235: exiting _queue_task() for managed_node3/service_facts 30575 1726867642.93273: done queuing things up, now waiting for results queue to drain 30575 1726867642.93275: waiting for pending results... 
30575 1726867642.93696: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867642.93800: in run() - task 0affcac9-a3a5-e081-a588-000000001a1b 30575 1726867642.93826: variable 'ansible_search_path' from source: unknown 30575 1726867642.93834: variable 'ansible_search_path' from source: unknown 30575 1726867642.93881: calling self._execute() 30575 1726867642.93999: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867642.94019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867642.94039: variable 'omit' from source: magic vars 30575 1726867642.94490: variable 'ansible_distribution_major_version' from source: facts 30575 1726867642.94510: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867642.94530: variable 'omit' from source: magic vars 30575 1726867642.94694: variable 'omit' from source: magic vars 30575 1726867642.94748: variable 'omit' from source: magic vars 30575 1726867642.94800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867642.94849: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867642.94883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867642.94913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867642.94945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867642.94990: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867642.95001: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867642.95013: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867642.95140: Set connection var ansible_pipelining to False 30575 1726867642.95155: Set connection var ansible_shell_type to sh 30575 1726867642.95168: Set connection var ansible_shell_executable to /bin/sh 30575 1726867642.95262: Set connection var ansible_timeout to 10 30575 1726867642.95270: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867642.95273: Set connection var ansible_connection to ssh 30575 1726867642.95276: variable 'ansible_shell_executable' from source: unknown 30575 1726867642.95281: variable 'ansible_connection' from source: unknown 30575 1726867642.95284: variable 'ansible_module_compression' from source: unknown 30575 1726867642.95286: variable 'ansible_shell_type' from source: unknown 30575 1726867642.95288: variable 'ansible_shell_executable' from source: unknown 30575 1726867642.95290: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867642.95292: variable 'ansible_pipelining' from source: unknown 30575 1726867642.95294: variable 'ansible_timeout' from source: unknown 30575 1726867642.95299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867642.95536: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867642.95565: variable 'omit' from source: magic vars 30575 1726867642.95578: starting attempt loop 30575 1726867642.95693: running the handler 30575 1726867642.95696: _low_level_execute_command(): starting 30575 1726867642.95698: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867642.96501: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867642.96583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867642.96607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867642.96704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867642.98420: stdout chunk (state=3): >>>/root <<< 30575 1726867642.98582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867642.98586: stdout chunk (state=3): >>><<< 30575 1726867642.98590: stderr chunk (state=3): >>><<< 30575 1726867642.98612: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867642.98718: _low_level_execute_command(): starting 30575 1726867642.98723: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867642.9862218-34237-236864547780201 `" && echo ansible-tmp-1726867642.9862218-34237-236864547780201="` echo /root/.ansible/tmp/ansible-tmp-1726867642.9862218-34237-236864547780201 `" ) && sleep 0' 30575 1726867642.99290: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867642.99314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867642.99336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867642.99357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867642.99381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867642.99397: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867642.99447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867642.99540: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867642.99584: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867642.99652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867643.01598: stdout chunk (state=3): >>>ansible-tmp-1726867642.9862218-34237-236864547780201=/root/.ansible/tmp/ansible-tmp-1726867642.9862218-34237-236864547780201 <<< 30575 1726867643.01769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867643.01773: stdout chunk (state=3): >>><<< 30575 1726867643.01776: stderr chunk (state=3): >>><<< 30575 1726867643.01918: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867642.9862218-34237-236864547780201=/root/.ansible/tmp/ansible-tmp-1726867642.9862218-34237-236864547780201 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867643.01921: variable 'ansible_module_compression' from source: unknown 30575 1726867643.01961: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30575 1726867643.02046: variable 'ansible_facts' from source: unknown 30575 1726867643.02224: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867642.9862218-34237-236864547780201/AnsiballZ_service_facts.py 30575 1726867643.02486: Sending initial data 30575 1726867643.02489: Sent initial data (162 bytes) 30575 1726867643.03204: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867643.03211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867643.03249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867643.04793: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867643.04800: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867643.04843: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867643.04922: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpf3v30cf3 /root/.ansible/tmp/ansible-tmp-1726867642.9862218-34237-236864547780201/AnsiballZ_service_facts.py <<< 30575 1726867643.04925: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867642.9862218-34237-236864547780201/AnsiballZ_service_facts.py" <<< 30575 1726867643.04946: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 30575 1726867643.04951: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpf3v30cf3" to remote "/root/.ansible/tmp/ansible-tmp-1726867642.9862218-34237-236864547780201/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867642.9862218-34237-236864547780201/AnsiballZ_service_facts.py" <<< 30575 1726867643.05587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867643.05646: stderr chunk (state=3): >>><<< 30575 1726867643.05654: stdout chunk (state=3): >>><<< 30575 1726867643.05657: done transferring module to remote 30575 1726867643.05659: _low_level_execute_command(): starting 30575 1726867643.05661: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867642.9862218-34237-236864547780201/ /root/.ansible/tmp/ansible-tmp-1726867642.9862218-34237-236864547780201/AnsiballZ_service_facts.py && sleep 0' 30575 1726867643.06257: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867643.06261: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867643.06263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867643.06265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867643.06313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867643.06316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867643.06368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867643.08111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867643.08142: stderr chunk (state=3): >>><<< 30575 1726867643.08144: stdout chunk (state=3): >>><<< 30575 1726867643.08156: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867643.08220: _low_level_execute_command(): starting 30575 1726867643.08224: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867642.9862218-34237-236864547780201/AnsiballZ_service_facts.py && sleep 0' 30575 1726867643.08710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867643.08714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867643.08716: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867643.08769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 
1726867643.08782: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867643.08853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867644.59441: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": 
"dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": 
{"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": 
{"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": 
"user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30575 1726867644.61087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867644.61102: stdout chunk (state=3): >>><<< 30575 1726867644.61148: stderr chunk (state=3): >>><<< 30575 1726867644.61517: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": 
{"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": 
"rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867644.63845: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867642.9862218-34237-236864547780201/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867644.64269: _low_level_execute_command(): starting 30575 1726867644.64272: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867642.9862218-34237-236864547780201/ > /dev/null 2>&1 && sleep 0' 30575 1726867644.65451: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867644.65486: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867644.65505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867644.65696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867644.65768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867644.67700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867644.67718: stdout chunk (state=3): >>><<< 30575 1726867644.67730: stderr chunk (state=3): >>><<< 30575 1726867644.67752: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867644.67766: handler run complete 30575 1726867644.68225: variable 'ansible_facts' from source: unknown 30575 1726867644.68538: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867644.69650: variable 'ansible_facts' from source: unknown 30575 1726867644.70036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867644.70471: attempt loop complete, returning result 30575 1726867644.70485: _execute() done 30575 1726867644.70493: dumping result to json 30575 1726867644.70560: done dumping result, returning 30575 1726867644.70627: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-e081-a588-000000001a1b] 30575 1726867644.70638: sending task result for task 0affcac9-a3a5-e081-a588-000000001a1b 30575 1726867644.72311: done sending task result for task 0affcac9-a3a5-e081-a588-000000001a1b 30575 1726867644.72315: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867644.72490: no more pending results, returning what we have 30575 1726867644.72492: results queue empty 30575 1726867644.72493: checking for any_errors_fatal 30575 1726867644.72496: done checking for any_errors_fatal 30575 1726867644.72497: checking for max_fail_percentage 30575 1726867644.72499: done checking for max_fail_percentage 30575 1726867644.72499: checking to see if all hosts have failed and the running result is not ok 30575 1726867644.72500: done checking to see if all hosts have failed 30575 1726867644.72501: getting the remaining hosts for this loop 30575 1726867644.72502: done getting the remaining hosts for this loop 30575 1726867644.72505: getting the next task for host managed_node3 30575 1726867644.72514: done getting next task for host managed_node3 30575 1726867644.72517: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 
1726867644.72525: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867644.72536: getting variables 30575 1726867644.72537: in VariableManager get_vars() 30575 1726867644.72572: Calling all_inventory to load vars for managed_node3 30575 1726867644.72575: Calling groups_inventory to load vars for managed_node3 30575 1726867644.72579: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867644.72588: Calling all_plugins_play to load vars for managed_node3 30575 1726867644.72590: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867644.72593: Calling groups_plugins_play to load vars for managed_node3 30575 1726867644.74182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867644.76622: done with get_vars() 30575 1726867644.76647: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:27:24 -0400 (0:00:01.839) 0:01:20.145 ****** 30575 1726867644.76757: entering _queue_task() for managed_node3/package_facts 30575 1726867644.77120: worker is 1 (out of 1 available) 30575 1726867644.77133: exiting _queue_task() for managed_node3/package_facts 30575 1726867644.77146: done queuing things up, now waiting for results queue to drain 30575 1726867644.77148: waiting for pending results... 
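Each module run in this trace follows the same pattern: the controller wraps the module into a single self-contained AnsiballZ payload, creates a private remote temp dir, copies the payload over SFTP, `chmod`s it, and executes it with the target's Python. A minimal local toy of that idea follows; note that real AnsiballZ files are self-extracting `.py` scripts carrying an embedded zip, whereas this sketch uses a bare runnable zip archive, and the file name is hypothetical:

```python
import os
import subprocess
import sys
import tempfile
import zipfile

# Toy stand-in for an AnsiballZ payload: a runnable zip whose __main__.py
# plays the role of the wrapped module. Everything here runs locally; the
# real flow copies the payload to the managed node over SSH/SFTP first.
with tempfile.TemporaryDirectory() as tmpdir:
    payload = os.path.join(tmpdir, "AnsiballZ_demo.zip")  # hypothetical name
    with zipfile.ZipFile(payload, "w") as zf:
        zf.writestr("__main__.py", "print('demo module ran')\n")
    os.chmod(payload, 0o700)  # mirrors the `chmod u+x` step in the log
    # `python3 payload.zip` executes the archive's __main__.py via zipimport.
    result = subprocess.run([sys.executable, payload],
                            capture_output=True, text=True, check=True)

print(result.stdout.strip())  # demo module ran
```

The private temp dir in the logged command comes from `umask 77 && mkdir -p ...`, which guarantees owner-only (0700) permissions before any module code lands on disk.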
30575 1726867644.77513: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867644.77725: in run() - task 0affcac9-a3a5-e081-a588-000000001a1c 30575 1726867644.77746: variable 'ansible_search_path' from source: unknown 30575 1726867644.77755: variable 'ansible_search_path' from source: unknown 30575 1726867644.77828: calling self._execute() 30575 1726867644.77902: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867644.77920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867644.77943: variable 'omit' from source: magic vars 30575 1726867644.78882: variable 'ansible_distribution_major_version' from source: facts 30575 1726867644.78885: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867644.78888: variable 'omit' from source: magic vars 30575 1726867644.79127: variable 'omit' from source: magic vars 30575 1726867644.79131: variable 'omit' from source: magic vars 30575 1726867644.79133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867644.79202: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867644.79272: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867644.79334: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867644.79346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867644.79379: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867644.79421: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867644.79424: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867644.79549: Set connection var ansible_pipelining to False 30575 1726867644.79553: Set connection var ansible_shell_type to sh 30575 1726867644.79559: Set connection var ansible_shell_executable to /bin/sh 30575 1726867644.79562: Set connection var ansible_timeout to 10 30575 1726867644.79570: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867644.79576: Set connection var ansible_connection to ssh 30575 1726867644.79680: variable 'ansible_shell_executable' from source: unknown 30575 1726867644.79687: variable 'ansible_connection' from source: unknown 30575 1726867644.79690: variable 'ansible_module_compression' from source: unknown 30575 1726867644.79692: variable 'ansible_shell_type' from source: unknown 30575 1726867644.79695: variable 'ansible_shell_executable' from source: unknown 30575 1726867644.79697: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867644.79699: variable 'ansible_pipelining' from source: unknown 30575 1726867644.79701: variable 'ansible_timeout' from source: unknown 30575 1726867644.79703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867644.79827: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867644.79841: variable 'omit' from source: magic vars 30575 1726867644.79846: starting attempt loop 30575 1726867644.79850: running the handler 30575 1726867644.79869: _low_level_execute_command(): starting 30575 1726867644.79876: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867644.80541: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867644.80550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30575 1726867644.80596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867644.80611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867644.80673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867644.80714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867644.80771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867644.82454: stdout chunk (state=3): >>>/root <<< 30575 1726867644.82634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867644.82637: stdout chunk (state=3): >>><<< 30575 1726867644.82639: stderr chunk (state=3): >>><<< 30575 1726867644.82655: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867644.82672: _low_level_execute_command(): starting 30575 1726867644.82748: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867644.826613-34339-16225369148267 `" && echo ansible-tmp-1726867644.826613-34339-16225369148267="` echo /root/.ansible/tmp/ansible-tmp-1726867644.826613-34339-16225369148267 `" ) && sleep 0' 30575 1726867644.83690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867644.83740: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867644.83756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867644.83780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867644.83922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867644.83956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867644.83975: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867644.84130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867644.85944: stdout chunk (state=3): >>>ansible-tmp-1726867644.826613-34339-16225369148267=/root/.ansible/tmp/ansible-tmp-1726867644.826613-34339-16225369148267 <<< 30575 1726867644.86108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867644.86127: stdout chunk (state=3): >>><<< 30575 1726867644.86158: stderr chunk (state=3): >>><<< 30575 1726867644.86193: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867644.826613-34339-16225369148267=/root/.ansible/tmp/ansible-tmp-1726867644.826613-34339-16225369148267 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867644.86257: variable 'ansible_module_compression' from source: unknown 30575 1726867644.86314: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30575 1726867644.86391: variable 'ansible_facts' from source: unknown 30575 1726867644.86606: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867644.826613-34339-16225369148267/AnsiballZ_package_facts.py 30575 1726867644.86864: Sending initial data 30575 1726867644.86868: Sent initial data (160 bytes) 30575 1726867644.87526: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867644.87552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867644.87569: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867644.87593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867644.87691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867644.89197: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30575 1726867644.89227: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867644.89268: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867644.89340: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp1h_otn84 /root/.ansible/tmp/ansible-tmp-1726867644.826613-34339-16225369148267/AnsiballZ_package_facts.py <<< 30575 1726867644.89344: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867644.826613-34339-16225369148267/AnsiballZ_package_facts.py" <<< 30575 1726867644.89451: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp1h_otn84" to remote "/root/.ansible/tmp/ansible-tmp-1726867644.826613-34339-16225369148267/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867644.826613-34339-16225369148267/AnsiballZ_package_facts.py" <<< 30575 1726867644.91225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867644.91360: stderr chunk (state=3): >>><<< 30575 1726867644.91363: stdout chunk (state=3): >>><<< 30575 1726867644.91366: done transferring module to remote 30575 1726867644.91368: _low_level_execute_command(): starting 30575 1726867644.91371: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867644.826613-34339-16225369148267/ /root/.ansible/tmp/ansible-tmp-1726867644.826613-34339-16225369148267/AnsiballZ_package_facts.py && sleep 0' 30575 1726867644.91946: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867644.91960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867644.91975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867644.92001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867644.92061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867644.92125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867644.92140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867644.92218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867644.93998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867644.94013: stdout chunk (state=3): >>><<< 30575 1726867644.94026: stderr chunk (state=3): >>><<< 30575 1726867644.94084: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867644.94087: _low_level_execute_command(): starting 30575 1726867644.94090: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867644.826613-34339-16225369148267/AnsiballZ_package_facts.py && sleep 0' 30575 1726867644.94672: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867644.94690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867644.94706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867644.94731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867644.94750: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867644.94845: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 
1726867644.94863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867644.94884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867644.94907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867644.94981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867645.39226: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": 
[{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": 
"4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30575 1726867645.39393: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": 
[{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": 
"squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": 
"5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 30575 1726867645.39416: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": 
"grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": 
[{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": 
[{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": 
[{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": 
[{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30575 1726867645.39482: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", 
"version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 
0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": 
"x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30575 1726867645.41407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867645.41411: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867645.41413: stdout chunk (state=3): >>><<< 30575 1726867645.41416: stderr chunk (state=3): >>><<< 30575 1726867645.41437: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867645.46322: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867644.826613-34339-16225369148267/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867645.46327: _low_level_execute_command(): starting 30575 1726867645.46428: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867644.826613-34339-16225369148267/ > /dev/null 2>&1 && sleep 0' 30575 1726867645.47594: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867645.47801: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867645.47912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867645.49831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867645.49835: stdout chunk (state=3): >>><<< 30575 1726867645.49838: stderr chunk (state=3): >>><<< 30575 1726867645.50091: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867645.50094: handler run complete 30575 1726867645.52002: variable 'ansible_facts' from source: unknown 30575 1726867645.53085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 
1726867645.57044: variable 'ansible_facts' from source: unknown 30575 1726867645.57944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867645.59820: attempt loop complete, returning result 30575 1726867645.59972: _execute() done 30575 1726867645.59984: dumping result to json 30575 1726867645.60629: done dumping result, returning 30575 1726867645.60647: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-e081-a588-000000001a1c] 30575 1726867645.60658: sending task result for task 0affcac9-a3a5-e081-a588-000000001a1c 30575 1726867645.66033: done sending task result for task 0affcac9-a3a5-e081-a588-000000001a1c ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867645.66144: WORKER PROCESS EXITING 30575 1726867645.66157: no more pending results, returning what we have 30575 1726867645.66160: results queue empty 30575 1726867645.66161: checking for any_errors_fatal 30575 1726867645.66166: done checking for any_errors_fatal 30575 1726867645.66167: checking for max_fail_percentage 30575 1726867645.66168: done checking for max_fail_percentage 30575 1726867645.66169: checking to see if all hosts have failed and the running result is not ok 30575 1726867645.66170: done checking to see if all hosts have failed 30575 1726867645.66171: getting the remaining hosts for this loop 30575 1726867645.66172: done getting the remaining hosts for this loop 30575 1726867645.66175: getting the next task for host managed_node3 30575 1726867645.66185: done getting next task for host managed_node3 30575 1726867645.66189: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867645.66194: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867645.66209: getting variables 30575 1726867645.66211: in VariableManager get_vars() 30575 1726867645.66243: Calling all_inventory to load vars for managed_node3 30575 1726867645.66246: Calling groups_inventory to load vars for managed_node3 30575 1726867645.66248: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867645.66257: Calling all_plugins_play to load vars for managed_node3 30575 1726867645.66259: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867645.66262: Calling groups_plugins_play to load vars for managed_node3 30575 1726867645.69014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867645.72110: done with get_vars() 30575 1726867645.72136: done getting variables 30575 1726867645.72199: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:27:25 -0400 (0:00:00.954) 0:01:21.100 ****** 30575 1726867645.72247: entering _queue_task() for managed_node3/debug 30575 1726867645.72784: worker is 1 (out of 1 available) 30575 1726867645.72794: exiting _queue_task() for managed_node3/debug 30575 1726867645.72804: done queuing things up, now waiting for results queue to drain 30575 1726867645.72806: waiting for pending results... 
30575 1726867645.73036: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867645.73390: in run() - task 0affcac9-a3a5-e081-a588-0000000019c0 30575 1726867645.73395: variable 'ansible_search_path' from source: unknown 30575 1726867645.73398: variable 'ansible_search_path' from source: unknown 30575 1726867645.73401: calling self._execute() 30575 1726867645.73404: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867645.73407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867645.73409: variable 'omit' from source: magic vars 30575 1726867645.73988: variable 'ansible_distribution_major_version' from source: facts 30575 1726867645.74000: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867645.74005: variable 'omit' from source: magic vars 30575 1726867645.74070: variable 'omit' from source: magic vars 30575 1726867645.74171: variable 'network_provider' from source: set_fact 30575 1726867645.74288: variable 'omit' from source: magic vars 30575 1726867645.74420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867645.74457: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867645.74474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867645.74534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867645.74546: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867645.74575: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867645.74580: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867645.74583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867645.74868: Set connection var ansible_pipelining to False 30575 1726867645.74872: Set connection var ansible_shell_type to sh 30575 1726867645.74876: Set connection var ansible_shell_executable to /bin/sh 30575 1726867645.74884: Set connection var ansible_timeout to 10 30575 1726867645.74889: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867645.74896: Set connection var ansible_connection to ssh 30575 1726867645.74920: variable 'ansible_shell_executable' from source: unknown 30575 1726867645.74924: variable 'ansible_connection' from source: unknown 30575 1726867645.74926: variable 'ansible_module_compression' from source: unknown 30575 1726867645.74928: variable 'ansible_shell_type' from source: unknown 30575 1726867645.74931: variable 'ansible_shell_executable' from source: unknown 30575 1726867645.74933: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867645.74935: variable 'ansible_pipelining' from source: unknown 30575 1726867645.74938: variable 'ansible_timeout' from source: unknown 30575 1726867645.74942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867645.75426: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867645.75435: variable 'omit' from source: magic vars 30575 1726867645.75438: starting attempt loop 30575 1726867645.75441: running the handler 30575 1726867645.75484: handler run complete 30575 1726867645.75498: attempt loop complete, returning result 30575 1726867645.75501: _execute() done 30575 1726867645.75504: dumping result to json 30575 1726867645.75586: done dumping result, returning 
30575 1726867645.75596: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-e081-a588-0000000019c0] 30575 1726867645.75602: sending task result for task 0affcac9-a3a5-e081-a588-0000000019c0 30575 1726867645.75942: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019c0 30575 1726867645.75946: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30575 1726867645.76018: no more pending results, returning what we have 30575 1726867645.76022: results queue empty 30575 1726867645.76023: checking for any_errors_fatal 30575 1726867645.76031: done checking for any_errors_fatal 30575 1726867645.76031: checking for max_fail_percentage 30575 1726867645.76033: done checking for max_fail_percentage 30575 1726867645.76034: checking to see if all hosts have failed and the running result is not ok 30575 1726867645.76035: done checking to see if all hosts have failed 30575 1726867645.76035: getting the remaining hosts for this loop 30575 1726867645.76037: done getting the remaining hosts for this loop 30575 1726867645.76041: getting the next task for host managed_node3 30575 1726867645.76050: done getting next task for host managed_node3 30575 1726867645.76057: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867645.76062: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867645.76085: getting variables 30575 1726867645.76087: in VariableManager get_vars() 30575 1726867645.76135: Calling all_inventory to load vars for managed_node3 30575 1726867645.76138: Calling groups_inventory to load vars for managed_node3 30575 1726867645.76141: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867645.76153: Calling all_plugins_play to load vars for managed_node3 30575 1726867645.76156: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867645.76159: Calling groups_plugins_play to load vars for managed_node3 30575 1726867645.77858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867645.80072: done with get_vars() 30575 1726867645.80297: done getting variables 30575 1726867645.80355: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:27:25 -0400 (0:00:00.081) 0:01:21.181 ****** 30575 1726867645.80396: entering _queue_task() for managed_node3/fail 30575 1726867645.80868: worker is 1 (out of 1 available) 30575 1726867645.80883: exiting _queue_task() for managed_node3/fail 30575 1726867645.80896: done queuing things up, now waiting for results queue to drain 30575 1726867645.80898: waiting for pending results... 30575 1726867645.81294: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867645.81369: in run() - task 0affcac9-a3a5-e081-a588-0000000019c1 30575 1726867645.81386: variable 'ansible_search_path' from source: unknown 30575 1726867645.81390: variable 'ansible_search_path' from source: unknown 30575 1726867645.81436: calling self._execute() 30575 1726867645.81668: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867645.81672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867645.81676: variable 'omit' from source: magic vars 30575 1726867645.82082: variable 'ansible_distribution_major_version' from source: facts 30575 1726867645.82086: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867645.82120: variable 'network_state' from source: role '' defaults 30575 1726867645.82131: Evaluated conditional (network_state != {}): False 30575 1726867645.82134: when evaluation is False, skipping this task 30575 1726867645.82137: _execute() done 30575 1726867645.82140: dumping result to json 30575 1726867645.82142: done dumping result, returning 30575 1726867645.82151: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-e081-a588-0000000019c1] 30575 1726867645.82162: sending task result for task 0affcac9-a3a5-e081-a588-0000000019c1 30575 1726867645.82255: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019c1 30575 1726867645.82258: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867645.82325: no more pending results, returning what we have 30575 1726867645.82329: results queue empty 30575 1726867645.82330: checking for any_errors_fatal 30575 1726867645.82338: done checking for any_errors_fatal 30575 1726867645.82339: checking for max_fail_percentage 30575 1726867645.82341: done checking for max_fail_percentage 30575 1726867645.82342: checking to see if all hosts have failed and the running result is not ok 30575 1726867645.82343: done checking to see if all hosts have failed 30575 1726867645.82343: getting the remaining hosts for this loop 30575 1726867645.82345: done getting the remaining hosts for this loop 30575 1726867645.82349: getting the next task for host managed_node3 30575 1726867645.82357: done getting next task for host managed_node3 30575 1726867645.82362: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867645.82367: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867645.82406: getting variables 30575 1726867645.82412: in VariableManager get_vars() 30575 1726867645.82465: Calling all_inventory to load vars for managed_node3 30575 1726867645.82468: Calling groups_inventory to load vars for managed_node3 30575 1726867645.82471: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867645.82596: Calling all_plugins_play to load vars for managed_node3 30575 1726867645.82600: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867645.82604: Calling groups_plugins_play to load vars for managed_node3 30575 1726867645.84853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867645.88248: done with get_vars() 30575 1726867645.88274: done getting variables 30575 1726867645.88337: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:27:25 -0400 (0:00:00.079) 0:01:21.261 ****** 30575 1726867645.88373: entering _queue_task() for managed_node3/fail 30575 1726867645.89046: worker is 1 (out of 1 available) 30575 1726867645.89059: exiting _queue_task() for managed_node3/fail 30575 1726867645.89072: done queuing things up, now waiting for results queue to drain 30575 1726867645.89074: waiting for pending results... 30575 1726867645.89597: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867645.89603: in run() - task 0affcac9-a3a5-e081-a588-0000000019c2 30575 1726867645.89606: variable 'ansible_search_path' from source: unknown 30575 1726867645.89609: variable 'ansible_search_path' from source: unknown 30575 1726867645.89612: calling self._execute() 30575 1726867645.89783: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867645.89787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867645.89790: variable 'omit' from source: magic vars 30575 1726867645.90185: variable 'ansible_distribution_major_version' from source: facts 30575 1726867645.90189: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867645.90200: variable 'network_state' from source: role '' defaults 30575 1726867645.90215: Evaluated conditional (network_state != {}): False 30575 1726867645.90218: when evaluation is False, skipping this task 30575 1726867645.90221: _execute() done 30575 1726867645.90224: dumping result to json 30575 1726867645.90226: done dumping result, returning 30575 1726867645.90236: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-e081-a588-0000000019c2] 30575 1726867645.90239: sending task result for task 0affcac9-a3a5-e081-a588-0000000019c2 30575 1726867645.90549: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019c2 30575 1726867645.90552: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867645.90627: no more pending results, returning what we have 30575 1726867645.90631: results queue empty 30575 1726867645.90632: checking for any_errors_fatal 30575 1726867645.90640: done checking for any_errors_fatal 30575 1726867645.90641: checking for max_fail_percentage 30575 1726867645.90643: done checking for max_fail_percentage 30575 1726867645.90644: checking to see if all hosts have failed and the running result is not ok 30575 1726867645.90645: done checking to see if all hosts have failed 30575 1726867645.90645: getting the remaining hosts for this loop 30575 1726867645.90647: done getting the remaining hosts for this loop 30575 1726867645.90651: getting the next task for host managed_node3 30575 1726867645.90659: done getting next task for host managed_node3 30575 1726867645.90663: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867645.90668: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867645.90697: getting variables 30575 1726867645.90699: in VariableManager get_vars() 30575 1726867645.90753: Calling all_inventory to load vars for managed_node3 30575 1726867645.90756: Calling groups_inventory to load vars for managed_node3 30575 1726867645.90759: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867645.90774: Calling all_plugins_play to load vars for managed_node3 30575 1726867645.91180: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867645.91186: Calling groups_plugins_play to load vars for managed_node3 30575 1726867645.94568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867645.96519: done with get_vars() 30575 1726867645.96545: done getting variables 30575 1726867645.96618: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:27:25 -0400 (0:00:00.082) 0:01:21.344 ****** 30575 1726867645.96657: entering _queue_task() for managed_node3/fail 30575 1726867645.97026: worker is 1 (out of 1 available) 30575 1726867645.97038: exiting _queue_task() for managed_node3/fail 30575 1726867645.97163: done queuing things up, now waiting for results queue to drain 30575 1726867645.97165: waiting for pending results... 30575 1726867645.97780: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867645.98046: in run() - task 0affcac9-a3a5-e081-a588-0000000019c3 30575 1726867645.98060: variable 'ansible_search_path' from source: unknown 30575 1726867645.98063: variable 'ansible_search_path' from source: unknown 30575 1726867645.98104: calling self._execute() 30575 1726867645.98381: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867645.98385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867645.98396: variable 'omit' from source: magic vars 30575 1726867645.99383: variable 'ansible_distribution_major_version' from source: facts 30575 1726867645.99387: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867645.99390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867646.02282: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867646.02401: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867646.02554: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867646.02592: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867646.02622: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867646.02911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.02958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.03173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.03221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.03236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.03483: variable 'ansible_distribution_major_version' from source: facts 30575 1726867646.03487: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30575 1726867646.03490: variable 'ansible_distribution' from source: facts 30575 1726867646.03493: variable '__network_rh_distros' from source: role '' defaults 30575 1726867646.03496: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30575 1726867646.03756: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.03881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.03885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.03888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.03890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.03902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.03929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.03959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.03998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 
1726867646.04014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.04059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.04087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.04109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.04150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.04170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.04492: variable 'network_connections' from source: include params 30575 1726867646.04503: variable 'interface' from source: play vars 30575 1726867646.04570: variable 'interface' from source: play vars 30575 1726867646.04581: variable 'network_state' from source: role '' defaults 30575 1726867646.04658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867646.04833: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867646.04872: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867646.04905: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867646.04944: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867646.04986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867646.05007: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867646.05042: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.05068: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867646.05093: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30575 1726867646.05096: when evaluation is False, skipping this task 30575 1726867646.05099: _execute() done 30575 1726867646.05102: dumping result to json 30575 1726867646.05106: done dumping result, returning 30575 1726867646.05119: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-e081-a588-0000000019c3] 30575 1726867646.05122: sending task result for task 
0affcac9-a3a5-e081-a588-0000000019c3 30575 1726867646.05221: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019c3 30575 1726867646.05223: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30575 1726867646.05294: no more pending results, returning what we have 30575 1726867646.05298: results queue empty 30575 1726867646.05298: checking for any_errors_fatal 30575 1726867646.05306: done checking for any_errors_fatal 30575 1726867646.05306: checking for max_fail_percentage 30575 1726867646.05311: done checking for max_fail_percentage 30575 1726867646.05312: checking to see if all hosts have failed and the running result is not ok 30575 1726867646.05313: done checking to see if all hosts have failed 30575 1726867646.05313: getting the remaining hosts for this loop 30575 1726867646.05316: done getting the remaining hosts for this loop 30575 1726867646.05320: getting the next task for host managed_node3 30575 1726867646.05330: done getting next task for host managed_node3 30575 1726867646.05335: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867646.05339: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867646.05367: getting variables 30575 1726867646.05370: in VariableManager get_vars() 30575 1726867646.05424: Calling all_inventory to load vars for managed_node3 30575 1726867646.05427: Calling groups_inventory to load vars for managed_node3 30575 1726867646.05429: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867646.05442: Calling all_plugins_play to load vars for managed_node3 30575 1726867646.05445: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867646.05448: Calling groups_plugins_play to load vars for managed_node3 30575 1726867646.08266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867646.10457: done with get_vars() 30575 1726867646.10680: done getting variables 30575 1726867646.10742: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:27:26 -0400 (0:00:00.141) 0:01:21.485 ****** 30575 1726867646.10775: entering _queue_task() for managed_node3/dnf 30575 1726867646.11516: worker is 1 (out of 1 available) 30575 1726867646.11527: exiting _queue_task() for managed_node3/dnf 30575 1726867646.11538: done queuing things up, now waiting for results queue to drain 30575 1726867646.11540: waiting for pending results... 30575 1726867646.11996: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867646.12584: in run() - task 0affcac9-a3a5-e081-a588-0000000019c4 30575 1726867646.12589: variable 'ansible_search_path' from source: unknown 30575 1726867646.12592: variable 'ansible_search_path' from source: unknown 30575 1726867646.12596: calling self._execute() 30575 1726867646.12600: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867646.12603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867646.12607: variable 'omit' from source: magic vars 30575 1726867646.13384: variable 'ansible_distribution_major_version' from source: facts 30575 1726867646.13404: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867646.13884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867646.18110: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867646.18191: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867646.18238: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867646.18288: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867646.18324: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867646.18422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.18533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.18567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.18734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.18759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.19050: variable 'ansible_distribution' from source: facts 30575 1726867646.19060: variable 'ansible_distribution_major_version' from source: facts 30575 1726867646.19091: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30575 1726867646.19578: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867646.19721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.19912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.20000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.20016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.20100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.20103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.20127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.20216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.20318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.20322: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.20324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.20355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.20381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.20418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.20431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.20596: variable 'network_connections' from source: include params 30575 1726867646.20610: variable 'interface' from source: play vars 30575 1726867646.20751: variable 'interface' from source: play vars 30575 1726867646.20754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867646.20947: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867646.20979: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867646.21013: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867646.21041: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867646.21084: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867646.21109: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867646.21171: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.21175: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867646.21203: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867646.21462: variable 'network_connections' from source: include params 30575 1726867646.21466: variable 'interface' from source: play vars 30575 1726867646.21582: variable 'interface' from source: play vars 30575 1726867646.21585: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867646.21587: when evaluation is False, skipping this task 30575 1726867646.21589: _execute() done 30575 1726867646.21591: dumping result to json 30575 1726867646.21593: done dumping result, returning 30575 1726867646.21595: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-0000000019c4] 30575 
1726867646.21597: sending task result for task 0affcac9-a3a5-e081-a588-0000000019c4 30575 1726867646.21672: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019c4 30575 1726867646.21675: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867646.21757: no more pending results, returning what we have 30575 1726867646.21761: results queue empty 30575 1726867646.21762: checking for any_errors_fatal 30575 1726867646.21768: done checking for any_errors_fatal 30575 1726867646.21769: checking for max_fail_percentage 30575 1726867646.21771: done checking for max_fail_percentage 30575 1726867646.21772: checking to see if all hosts have failed and the running result is not ok 30575 1726867646.21773: done checking to see if all hosts have failed 30575 1726867646.21773: getting the remaining hosts for this loop 30575 1726867646.21775: done getting the remaining hosts for this loop 30575 1726867646.21882: getting the next task for host managed_node3 30575 1726867646.21891: done getting next task for host managed_node3 30575 1726867646.21895: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867646.21899: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867646.21921: getting variables 30575 1726867646.21922: in VariableManager get_vars() 30575 1726867646.21974: Calling all_inventory to load vars for managed_node3 30575 1726867646.21976: Calling groups_inventory to load vars for managed_node3 30575 1726867646.22265: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867646.22274: Calling all_plugins_play to load vars for managed_node3 30575 1726867646.22279: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867646.22281: Calling groups_plugins_play to load vars for managed_node3 30575 1726867646.24067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867646.26237: done with get_vars() 30575 1726867646.26264: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867646.26402: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:27:26 -0400 (0:00:00.156) 0:01:21.641 ****** 30575 1726867646.26437: entering _queue_task() for managed_node3/yum 30575 1726867646.27319: worker is 1 (out of 1 available) 30575 1726867646.27337: exiting _queue_task() for managed_node3/yum 30575 1726867646.27350: done queuing things up, now waiting for results queue to drain 30575 1726867646.27352: waiting for pending results... 30575 1726867646.27894: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867646.27902: in run() - task 0affcac9-a3a5-e081-a588-0000000019c5 30575 1726867646.27906: variable 'ansible_search_path' from source: unknown 30575 1726867646.27912: variable 'ansible_search_path' from source: unknown 30575 1726867646.27915: calling self._execute() 30575 1726867646.27935: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867646.27946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867646.27959: variable 'omit' from source: magic vars 30575 1726867646.28346: variable 'ansible_distribution_major_version' from source: facts 30575 1726867646.28363: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867646.28532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867646.32234: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867646.32304: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867646.32344: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867646.32392: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867646.32426: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867646.32512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.32914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.32950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.33001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.33022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.33123: variable 'ansible_distribution_major_version' from source: facts 30575 1726867646.33185: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30575 1726867646.33189: when evaluation is False, skipping this task 30575 1726867646.33192: _execute() done 30575 1726867646.33194: dumping result to json 30575 1726867646.33263: done dumping result, returning 30575 1726867646.33267: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-0000000019c5] 30575 1726867646.33269: sending task result for task 0affcac9-a3a5-e081-a588-0000000019c5 30575 1726867646.33338: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019c5 30575 1726867646.33340: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30575 1726867646.33413: no more pending results, returning what we have 30575 1726867646.33417: results queue empty 30575 1726867646.33418: checking for any_errors_fatal 30575 1726867646.33426: done checking for any_errors_fatal 30575 1726867646.33427: checking for max_fail_percentage 30575 1726867646.33429: done checking for max_fail_percentage 30575 1726867646.33431: checking to see if all hosts have failed and the running result is not ok 30575 1726867646.33432: done checking to see if all hosts have failed 30575 1726867646.33433: getting the remaining hosts for this loop 30575 1726867646.33435: done getting the remaining hosts for this loop 30575 1726867646.33440: getting the next task for host managed_node3 30575 1726867646.33450: done getting next task for host managed_node3 30575 1726867646.33455: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867646.33459: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867646.33512: getting variables 30575 1726867646.33514: in VariableManager get_vars() 30575 1726867646.33559: Calling all_inventory to load vars for managed_node3 30575 1726867646.33562: Calling groups_inventory to load vars for managed_node3 30575 1726867646.33565: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867646.33702: Calling all_plugins_play to load vars for managed_node3 30575 1726867646.33706: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867646.33713: Calling groups_plugins_play to load vars for managed_node3 30575 1726867646.35835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867646.36730: done with get_vars() 30575 1726867646.36746: done getting variables 30575 1726867646.36792: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:27:26 -0400 (0:00:00.103) 0:01:21.745 ****** 30575 1726867646.36819: entering _queue_task() for managed_node3/fail 30575 1726867646.37042: worker is 1 (out of 1 available) 30575 1726867646.37057: exiting _queue_task() for managed_node3/fail 30575 1726867646.37069: done queuing things up, now waiting for results queue to drain 30575 1726867646.37071: waiting for pending results... 30575 1726867646.37268: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867646.37488: in run() - task 0affcac9-a3a5-e081-a588-0000000019c6 30575 1726867646.37492: variable 'ansible_search_path' from source: unknown 30575 1726867646.37495: variable 'ansible_search_path' from source: unknown 30575 1726867646.37499: calling self._execute() 30575 1726867646.37513: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867646.37520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867646.37530: variable 'omit' from source: magic vars 30575 1726867646.37829: variable 'ansible_distribution_major_version' from source: facts 30575 1726867646.37838: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867646.37924: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867646.38054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867646.39667: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867646.39714: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867646.39739: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867646.39764: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867646.39787: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867646.39845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.39876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.39898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.39926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.39939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.39971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.39989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.40010: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.40035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.40045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.40073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.40090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.40109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.40134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.40144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.40255: variable 'network_connections' from source: include params 30575 1726867646.40264: variable 'interface' from source: play vars 30575 1726867646.40313: variable 'interface' from source: play vars 30575 1726867646.40362: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867646.40481: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867646.40511: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867646.40532: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867646.40556: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867646.40587: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867646.40602: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867646.40621: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.40638: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867646.40680: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867646.40824: variable 'network_connections' from source: include params 30575 1726867646.40827: variable 'interface' from source: play vars 30575 1726867646.40871: variable 'interface' from source: play vars 30575 1726867646.40893: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867646.40897: when evaluation is False, skipping this task 30575 
1726867646.40900: _execute() done 30575 1726867646.40902: dumping result to json 30575 1726867646.40904: done dumping result, returning 30575 1726867646.40913: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-0000000019c6] 30575 1726867646.40918: sending task result for task 0affcac9-a3a5-e081-a588-0000000019c6 30575 1726867646.41005: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019c6 30575 1726867646.41010: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867646.41058: no more pending results, returning what we have 30575 1726867646.41062: results queue empty 30575 1726867646.41062: checking for any_errors_fatal 30575 1726867646.41067: done checking for any_errors_fatal 30575 1726867646.41068: checking for max_fail_percentage 30575 1726867646.41069: done checking for max_fail_percentage 30575 1726867646.41070: checking to see if all hosts have failed and the running result is not ok 30575 1726867646.41071: done checking to see if all hosts have failed 30575 1726867646.41072: getting the remaining hosts for this loop 30575 1726867646.41073: done getting the remaining hosts for this loop 30575 1726867646.41079: getting the next task for host managed_node3 30575 1726867646.41087: done getting next task for host managed_node3 30575 1726867646.41091: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30575 1726867646.41096: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867646.41123: getting variables 30575 1726867646.41125: in VariableManager get_vars() 30575 1726867646.41166: Calling all_inventory to load vars for managed_node3 30575 1726867646.41169: Calling groups_inventory to load vars for managed_node3 30575 1726867646.41171: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867646.41184: Calling all_plugins_play to load vars for managed_node3 30575 1726867646.41187: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867646.41190: Calling groups_plugins_play to load vars for managed_node3 30575 1726867646.42005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867646.42881: done with get_vars() 30575 1726867646.42898: done getting variables 30575 1726867646.42946: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:27:26 -0400 (0:00:00.061) 0:01:21.807 ****** 30575 1726867646.42972: entering _queue_task() for managed_node3/package 30575 1726867646.43240: worker is 1 (out of 1 available) 30575 1726867646.43254: exiting _queue_task() for managed_node3/package 30575 1726867646.43267: done queuing things up, now waiting for results queue to drain 30575 1726867646.43269: waiting for pending results... 30575 1726867646.43472: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30575 1726867646.43582: in run() - task 0affcac9-a3a5-e081-a588-0000000019c7 30575 1726867646.43593: variable 'ansible_search_path' from source: unknown 30575 1726867646.43599: variable 'ansible_search_path' from source: unknown 30575 1726867646.43630: calling self._execute() 30575 1726867646.43719: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867646.43723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867646.43732: variable 'omit' from source: magic vars 30575 1726867646.44022: variable 'ansible_distribution_major_version' from source: facts 30575 1726867646.44032: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867646.44168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867646.44362: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867646.44397: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867646.44423: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867646.44476: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867646.44557: variable 'network_packages' from source: role '' defaults 30575 1726867646.44636: variable '__network_provider_setup' from source: role '' defaults 30575 1726867646.44882: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867646.44885: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867646.44888: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867646.44890: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867646.44942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867646.46994: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867646.47046: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867646.47075: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867646.47099: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867646.47122: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867646.47181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.47202: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.47222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.47248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.47259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.47295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.47313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.47329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.47353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.47363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 
1726867646.47506: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867646.47580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.47598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.47619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.47643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.47654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.47717: variable 'ansible_python' from source: facts 30575 1726867646.47730: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867646.47786: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867646.47844: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867646.47928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.47945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.47961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.47987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.47998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.48032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.48053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.48070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.48096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.48106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.48202: variable 'network_connections' from source: include params 
30575 1726867646.48205: variable 'interface' from source: play vars 30575 1726867646.48276: variable 'interface' from source: play vars 30575 1726867646.48329: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867646.48347: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867646.48367: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.48393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867646.48430: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867646.48606: variable 'network_connections' from source: include params 30575 1726867646.48613: variable 'interface' from source: play vars 30575 1726867646.48680: variable 'interface' from source: play vars 30575 1726867646.48703: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867646.48758: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867646.48953: variable 'network_connections' from source: include params 30575 1726867646.48956: variable 'interface' from source: play vars 30575 1726867646.49003: variable 'interface' from source: play vars 30575 1726867646.49022: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867646.49074: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867646.49267: variable 'network_connections' 
from source: include params 30575 1726867646.49270: variable 'interface' from source: play vars 30575 1726867646.49319: variable 'interface' from source: play vars 30575 1726867646.49358: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867646.49398: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867646.49404: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867646.49447: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867646.49579: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867646.49879: variable 'network_connections' from source: include params 30575 1726867646.49882: variable 'interface' from source: play vars 30575 1726867646.49928: variable 'interface' from source: play vars 30575 1726867646.49934: variable 'ansible_distribution' from source: facts 30575 1726867646.49937: variable '__network_rh_distros' from source: role '' defaults 30575 1726867646.49943: variable 'ansible_distribution_major_version' from source: facts 30575 1726867646.49954: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867646.50061: variable 'ansible_distribution' from source: facts 30575 1726867646.50064: variable '__network_rh_distros' from source: role '' defaults 30575 1726867646.50069: variable 'ansible_distribution_major_version' from source: facts 30575 1726867646.50082: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867646.50185: variable 'ansible_distribution' from source: facts 30575 1726867646.50189: variable '__network_rh_distros' from source: role '' defaults 30575 1726867646.50194: variable 'ansible_distribution_major_version' from source: facts 30575 1726867646.50223: variable 'network_provider' from source: set_fact 30575 
1726867646.50234: variable 'ansible_facts' from source: unknown 30575 1726867646.50591: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30575 1726867646.50594: when evaluation is False, skipping this task 30575 1726867646.50597: _execute() done 30575 1726867646.50599: dumping result to json 30575 1726867646.50603: done dumping result, returning 30575 1726867646.50614: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-e081-a588-0000000019c7] 30575 1726867646.50619: sending task result for task 0affcac9-a3a5-e081-a588-0000000019c7 30575 1726867646.50709: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019c7 30575 1726867646.50712: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30575 1726867646.50758: no more pending results, returning what we have 30575 1726867646.50762: results queue empty 30575 1726867646.50762: checking for any_errors_fatal 30575 1726867646.50769: done checking for any_errors_fatal 30575 1726867646.50770: checking for max_fail_percentage 30575 1726867646.50771: done checking for max_fail_percentage 30575 1726867646.50772: checking to see if all hosts have failed and the running result is not ok 30575 1726867646.50773: done checking to see if all hosts have failed 30575 1726867646.50774: getting the remaining hosts for this loop 30575 1726867646.50775: done getting the remaining hosts for this loop 30575 1726867646.50781: getting the next task for host managed_node3 30575 1726867646.50789: done getting next task for host managed_node3 30575 1726867646.50793: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867646.50798: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867646.50822: getting variables 30575 1726867646.50823: in VariableManager get_vars() 30575 1726867646.50869: Calling all_inventory to load vars for managed_node3 30575 1726867646.50872: Calling groups_inventory to load vars for managed_node3 30575 1726867646.50874: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867646.50888: Calling all_plugins_play to load vars for managed_node3 30575 1726867646.50891: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867646.50894: Calling groups_plugins_play to load vars for managed_node3 30575 1726867646.51819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867646.52668: done with get_vars() 30575 1726867646.52685: done getting variables 30575 1726867646.52728: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:27:26 -0400 (0:00:00.097) 0:01:21.905 ****** 30575 1726867646.52752: entering _queue_task() for managed_node3/package 30575 1726867646.52979: worker is 1 (out of 1 available) 30575 1726867646.52992: exiting _queue_task() for managed_node3/package 30575 1726867646.53004: done queuing things up, now waiting for results queue to drain 30575 1726867646.53006: waiting for pending results... 
30575 1726867646.53187: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867646.53311: in run() - task 0affcac9-a3a5-e081-a588-0000000019c8 30575 1726867646.53321: variable 'ansible_search_path' from source: unknown 30575 1726867646.53325: variable 'ansible_search_path' from source: unknown 30575 1726867646.53356: calling self._execute() 30575 1726867646.53427: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867646.53431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867646.53440: variable 'omit' from source: magic vars 30575 1726867646.53720: variable 'ansible_distribution_major_version' from source: facts 30575 1726867646.53728: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867646.53815: variable 'network_state' from source: role '' defaults 30575 1726867646.53824: Evaluated conditional (network_state != {}): False 30575 1726867646.53827: when evaluation is False, skipping this task 30575 1726867646.53829: _execute() done 30575 1726867646.53832: dumping result to json 30575 1726867646.53837: done dumping result, returning 30575 1726867646.53845: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-e081-a588-0000000019c8] 30575 1726867646.53850: sending task result for task 0affcac9-a3a5-e081-a588-0000000019c8 30575 1726867646.53939: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019c8 30575 1726867646.53941: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867646.53989: no more pending results, returning what we have 30575 1726867646.53993: results queue empty 30575 1726867646.53994: checking 
for any_errors_fatal 30575 1726867646.54002: done checking for any_errors_fatal 30575 1726867646.54003: checking for max_fail_percentage 30575 1726867646.54004: done checking for max_fail_percentage 30575 1726867646.54005: checking to see if all hosts have failed and the running result is not ok 30575 1726867646.54006: done checking to see if all hosts have failed 30575 1726867646.54009: getting the remaining hosts for this loop 30575 1726867646.54010: done getting the remaining hosts for this loop 30575 1726867646.54014: getting the next task for host managed_node3 30575 1726867646.54021: done getting next task for host managed_node3 30575 1726867646.54025: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867646.54030: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867646.54049: getting variables 30575 1726867646.54053: in VariableManager get_vars() 30575 1726867646.54090: Calling all_inventory to load vars for managed_node3 30575 1726867646.54093: Calling groups_inventory to load vars for managed_node3 30575 1726867646.54095: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867646.54103: Calling all_plugins_play to load vars for managed_node3 30575 1726867646.54105: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867646.54110: Calling groups_plugins_play to load vars for managed_node3 30575 1726867646.54866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867646.55857: done with get_vars() 30575 1726867646.55880: done getting variables 30575 1726867646.55933: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:27:26 -0400 (0:00:00.032) 0:01:21.937 ****** 30575 1726867646.55967: entering _queue_task() for managed_node3/package 30575 1726867646.56224: worker is 1 (out of 1 available) 30575 1726867646.56238: exiting _queue_task() for managed_node3/package 30575 1726867646.56250: done queuing things up, now waiting for results queue to drain 30575 1726867646.56252: waiting for pending results... 
30575 1726867646.56697: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867646.56702: in run() - task 0affcac9-a3a5-e081-a588-0000000019c9 30575 1726867646.56706: variable 'ansible_search_path' from source: unknown 30575 1726867646.56709: variable 'ansible_search_path' from source: unknown 30575 1726867646.56727: calling self._execute() 30575 1726867646.56823: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867646.56835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867646.56849: variable 'omit' from source: magic vars 30575 1726867646.57173: variable 'ansible_distribution_major_version' from source: facts 30575 1726867646.57183: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867646.57263: variable 'network_state' from source: role '' defaults 30575 1726867646.57272: Evaluated conditional (network_state != {}): False 30575 1726867646.57276: when evaluation is False, skipping this task 30575 1726867646.57280: _execute() done 30575 1726867646.57283: dumping result to json 30575 1726867646.57287: done dumping result, returning 30575 1726867646.57295: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-e081-a588-0000000019c9] 30575 1726867646.57299: sending task result for task 0affcac9-a3a5-e081-a588-0000000019c9 30575 1726867646.57389: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019c9 30575 1726867646.57392: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867646.57438: no more pending results, returning what we have 30575 1726867646.57441: results queue empty 30575 1726867646.57442: checking for 
any_errors_fatal 30575 1726867646.57447: done checking for any_errors_fatal 30575 1726867646.57447: checking for max_fail_percentage 30575 1726867646.57449: done checking for max_fail_percentage 30575 1726867646.57450: checking to see if all hosts have failed and the running result is not ok 30575 1726867646.57451: done checking to see if all hosts have failed 30575 1726867646.57451: getting the remaining hosts for this loop 30575 1726867646.57453: done getting the remaining hosts for this loop 30575 1726867646.57456: getting the next task for host managed_node3 30575 1726867646.57463: done getting next task for host managed_node3 30575 1726867646.57466: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867646.57471: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867646.57492: getting variables 30575 1726867646.57494: in VariableManager get_vars() 30575 1726867646.57527: Calling all_inventory to load vars for managed_node3 30575 1726867646.57530: Calling groups_inventory to load vars for managed_node3 30575 1726867646.57532: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867646.57539: Calling all_plugins_play to load vars for managed_node3 30575 1726867646.57541: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867646.57544: Calling groups_plugins_play to load vars for managed_node3 30575 1726867646.58267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867646.59503: done with get_vars() 30575 1726867646.59522: done getting variables 30575 1726867646.59570: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:27:26 -0400 (0:00:00.036) 0:01:21.973 ****** 30575 1726867646.59603: entering _queue_task() for managed_node3/service 30575 1726867646.59844: worker is 1 (out of 1 available) 30575 1726867646.59854: exiting _queue_task() for managed_node3/service 30575 1726867646.59866: done queuing things up, now waiting for results queue to drain 30575 1726867646.59867: waiting for pending results... 
30575 1726867646.60197: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867646.60263: in run() - task 0affcac9-a3a5-e081-a588-0000000019ca 30575 1726867646.60444: variable 'ansible_search_path' from source: unknown 30575 1726867646.60452: variable 'ansible_search_path' from source: unknown 30575 1726867646.60456: calling self._execute() 30575 1726867646.60459: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867646.60462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867646.60465: variable 'omit' from source: magic vars 30575 1726867646.60952: variable 'ansible_distribution_major_version' from source: facts 30575 1726867646.60963: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867646.61183: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867646.61275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867646.63497: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867646.63557: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867646.63596: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867646.63628: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867646.63657: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867646.63736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30575 1726867646.63779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.63816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.63854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.63869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.63921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.63945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.63968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.64368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.64371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.64374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.64376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.64383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.64386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.64643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.64761: variable 'network_connections' from source: include params 30575 1726867646.64772: variable 'interface' from source: play vars 30575 1726867646.64955: variable 'interface' from source: play vars 30575 1726867646.65127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867646.65456: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867646.65496: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867646.65529: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867646.65883: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867646.65886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867646.65889: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867646.65891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.66027: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867646.66080: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867646.66554: variable 'network_connections' from source: include params 30575 1726867646.66557: variable 'interface' from source: play vars 30575 1726867646.66645: variable 'interface' from source: play vars 30575 1726867646.66668: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867646.66672: when evaluation is False, skipping this task 30575 1726867646.66674: _execute() done 30575 1726867646.66679: dumping result to json 30575 1726867646.66815: done dumping result, returning 30575 1726867646.66825: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-0000000019ca] 30575 1726867646.66831: sending task result for task 0affcac9-a3a5-e081-a588-0000000019ca 30575 1726867646.66941: done sending task result for task 
0affcac9-a3a5-e081-a588-0000000019ca skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867646.67023: no more pending results, returning what we have 30575 1726867646.67027: results queue empty 30575 1726867646.67028: checking for any_errors_fatal 30575 1726867646.67037: done checking for any_errors_fatal 30575 1726867646.67038: checking for max_fail_percentage 30575 1726867646.67040: done checking for max_fail_percentage 30575 1726867646.67041: checking to see if all hosts have failed and the running result is not ok 30575 1726867646.67042: done checking to see if all hosts have failed 30575 1726867646.67043: getting the remaining hosts for this loop 30575 1726867646.67045: done getting the remaining hosts for this loop 30575 1726867646.67049: getting the next task for host managed_node3 30575 1726867646.67060: done getting next task for host managed_node3 30575 1726867646.67065: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867646.67070: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867646.67101: getting variables 30575 1726867646.67104: in VariableManager get_vars() 30575 1726867646.67152: Calling all_inventory to load vars for managed_node3 30575 1726867646.67155: Calling groups_inventory to load vars for managed_node3 30575 1726867646.67158: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867646.67169: Calling all_plugins_play to load vars for managed_node3 30575 1726867646.67173: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867646.67479: Calling groups_plugins_play to load vars for managed_node3 30575 1726867646.68294: WORKER PROCESS EXITING 30575 1726867646.69788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867646.71702: done with get_vars() 30575 1726867646.71724: done getting variables 30575 1726867646.71790: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:27:26 -0400 (0:00:00.122) 0:01:22.095 ****** 30575 1726867646.71824: entering _queue_task() for managed_node3/service 30575 1726867646.72274: worker is 1 (out of 1 available) 30575 1726867646.72402: exiting _queue_task() for managed_node3/service 30575 1726867646.72414: done 
queuing things up, now waiting for results queue to drain 30575 1726867646.72416: waiting for pending results... 30575 1726867646.72898: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867646.72962: in run() - task 0affcac9-a3a5-e081-a588-0000000019cb 30575 1726867646.73012: variable 'ansible_search_path' from source: unknown 30575 1726867646.73210: variable 'ansible_search_path' from source: unknown 30575 1726867646.73214: calling self._execute() 30575 1726867646.73319: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867646.73354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867646.73536: variable 'omit' from source: magic vars 30575 1726867646.74270: variable 'ansible_distribution_major_version' from source: facts 30575 1726867646.74332: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867646.74742: variable 'network_provider' from source: set_fact 30575 1726867646.74758: variable 'network_state' from source: role '' defaults 30575 1726867646.74773: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30575 1726867646.74787: variable 'omit' from source: magic vars 30575 1726867646.74907: variable 'omit' from source: magic vars 30575 1726867646.75013: variable 'network_service_name' from source: role '' defaults 30575 1726867646.75283: variable 'network_service_name' from source: role '' defaults 30575 1726867646.75431: variable '__network_provider_setup' from source: role '' defaults 30575 1726867646.75442: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867646.75584: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867646.75600: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867646.75670: variable '__network_packages_default_nm' from source: role '' 
defaults 30575 1726867646.76091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867646.79718: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867646.79830: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867646.79871: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867646.79958: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867646.80049: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867646.80252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.80291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.80379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.80426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.80476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.80615: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.80644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.80705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.80826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.80847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.81387: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867646.81522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.81559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.81592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.81635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.81663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.81756: variable 'ansible_python' from source: facts 30575 1726867646.81786: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867646.81872: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867646.81985: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867646.82100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.82132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.82161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.82214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.82234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.82286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867646.82422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867646.82428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.82431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867646.82434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867646.82573: variable 'network_connections' from source: include params 30575 1726867646.82590: variable 'interface' from source: play vars 30575 1726867646.82674: variable 'interface' from source: play vars 30575 1726867646.82794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867646.82991: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867646.83044: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867646.83101: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867646.83220: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867646.83309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867646.83344: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867646.83384: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867646.83484: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867646.83488: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867646.84275: variable 'network_connections' from source: include params 30575 1726867646.84281: variable 'interface' from source: play vars 30575 1726867646.84284: variable 'interface' from source: play vars 30575 1726867646.84414: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867646.84593: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867646.85019: variable 'network_connections' from source: include params 30575 1726867646.85038: variable 'interface' from source: play vars 30575 1726867646.85132: variable 'interface' from source: play vars 30575 1726867646.85145: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867646.85224: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867646.85526: variable 'network_connections' from source: include params 30575 1726867646.85535: variable 'interface' from source: play vars 30575 1726867646.85612: variable 'interface' from source: play vars 30575 1726867646.85668: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30575 1726867646.85733: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867646.85743: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867646.85845: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867646.86065: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867646.86594: variable 'network_connections' from source: include params 30575 1726867646.86604: variable 'interface' from source: play vars 30575 1726867646.86672: variable 'interface' from source: play vars 30575 1726867646.86688: variable 'ansible_distribution' from source: facts 30575 1726867646.86753: variable '__network_rh_distros' from source: role '' defaults 30575 1726867646.86759: variable 'ansible_distribution_major_version' from source: facts 30575 1726867646.86764: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867646.86914: variable 'ansible_distribution' from source: facts 30575 1726867646.86923: variable '__network_rh_distros' from source: role '' defaults 30575 1726867646.86933: variable 'ansible_distribution_major_version' from source: facts 30575 1726867646.86951: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867646.87132: variable 'ansible_distribution' from source: facts 30575 1726867646.87142: variable '__network_rh_distros' from source: role '' defaults 30575 1726867646.87153: variable 'ansible_distribution_major_version' from source: facts 30575 1726867646.87198: variable 'network_provider' from source: set_fact 30575 1726867646.87293: variable 'omit' from source: magic vars 30575 1726867646.87296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867646.87299: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867646.87316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867646.87338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867646.87354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867646.87389: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867646.87404: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867646.87413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867646.87522: Set connection var ansible_pipelining to False 30575 1726867646.87533: Set connection var ansible_shell_type to sh 30575 1726867646.87544: Set connection var ansible_shell_executable to /bin/sh 30575 1726867646.87553: Set connection var ansible_timeout to 10 30575 1726867646.87563: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867646.87619: Set connection var ansible_connection to ssh 30575 1726867646.87622: variable 'ansible_shell_executable' from source: unknown 30575 1726867646.87625: variable 'ansible_connection' from source: unknown 30575 1726867646.87627: variable 'ansible_module_compression' from source: unknown 30575 1726867646.87629: variable 'ansible_shell_type' from source: unknown 30575 1726867646.87631: variable 'ansible_shell_executable' from source: unknown 30575 1726867646.87632: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867646.87641: variable 'ansible_pipelining' from source: unknown 30575 1726867646.87648: variable 'ansible_timeout' from source: unknown 30575 1726867646.87657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867646.87762: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867646.87837: variable 'omit' from source: magic vars 30575 1726867646.87840: starting attempt loop 30575 1726867646.87843: running the handler 30575 1726867646.87881: variable 'ansible_facts' from source: unknown 30575 1726867646.88667: _low_level_execute_command(): starting 30575 1726867646.88682: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867646.89355: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867646.89370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867646.89388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867646.89406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867646.89423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867646.89435: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867646.89456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867646.89549: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867646.89570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867646.89666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867646.91364: stdout chunk (state=3): >>>/root <<< 30575 1726867646.91516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867646.91519: stdout chunk (state=3): >>><<< 30575 1726867646.91522: stderr chunk (state=3): >>><<< 30575 1726867646.91541: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867646.91635: _low_level_execute_command(): starting 30575 1726867646.91640: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867646.9154687-34480-100247047461889 `" && echo ansible-tmp-1726867646.9154687-34480-100247047461889="` echo /root/.ansible/tmp/ansible-tmp-1726867646.9154687-34480-100247047461889 `" ) && sleep 0' 30575 1726867646.92295: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867646.92320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867646.92336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867646.92356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867646.92439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867646.94402: stdout chunk (state=3): >>>ansible-tmp-1726867646.9154687-34480-100247047461889=/root/.ansible/tmp/ansible-tmp-1726867646.9154687-34480-100247047461889 <<< 30575 1726867646.94467: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 30575 1726867646.94646: stderr chunk (state=3): >>><<< 30575 1726867646.94696: stdout chunk (state=3): >>><<< 30575 1726867646.94714: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867646.9154687-34480-100247047461889=/root/.ansible/tmp/ansible-tmp-1726867646.9154687-34480-100247047461889 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867646.94787: variable 'ansible_module_compression' from source: unknown 30575 1726867646.94926: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30575 1726867646.94996: variable 'ansible_facts' from source: unknown 30575 1726867646.95301: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867646.9154687-34480-100247047461889/AnsiballZ_systemd.py 30575 
1726867646.95429: Sending initial data 30575 1726867646.95432: Sent initial data (156 bytes) 30575 1726867646.96072: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867646.96076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867646.96180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867646.96238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867646.97933: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: 
Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867646.97973: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867646.98032: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp_jloyqh7 /root/.ansible/tmp/ansible-tmp-1726867646.9154687-34480-100247047461889/AnsiballZ_systemd.py <<< 30575 1726867646.98035: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867646.9154687-34480-100247047461889/AnsiballZ_systemd.py" <<< 30575 1726867646.98151: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp_jloyqh7" to remote "/root/.ansible/tmp/ansible-tmp-1726867646.9154687-34480-100247047461889/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867646.9154687-34480-100247047461889/AnsiballZ_systemd.py" <<< 30575 1726867647.01246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867647.01249: stdout chunk (state=3): >>><<< 30575 1726867647.01252: stderr chunk (state=3): >>><<< 30575 1726867647.01255: done transferring module to remote 30575 1726867647.01257: _low_level_execute_command(): starting 30575 1726867647.01259: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867646.9154687-34480-100247047461889/ /root/.ansible/tmp/ansible-tmp-1726867646.9154687-34480-100247047461889/AnsiballZ_systemd.py && sleep 0' 30575 1726867647.02389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867647.02599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867647.02710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867647.04880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867647.04884: stdout chunk (state=3): >>><<< 30575 1726867647.04891: stderr chunk (state=3): >>><<< 30575 1726867647.04897: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867647.04900: _low_level_execute_command(): starting 30575 1726867647.04902: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867646.9154687-34480-100247047461889/AnsiballZ_systemd.py && sleep 0' 30575 1726867647.06028: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867647.06127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867647.06140: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30575 1726867647.06214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867647.35567: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10571776", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3310809088", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1928269000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": 
"infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30575 1726867647.35582: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid 
cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", 
"SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system<<< 30575 1726867647.35586: stdout chunk (state=3): >>>.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", 
"ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30575 1726867647.37420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867647.37423: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. <<< 30575 1726867647.37426: stderr chunk (state=3): >>><<< 30575 1726867647.37493: stdout chunk (state=3): >>><<< 30575 1726867647.37518: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", 
"OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10571776", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3310809088", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1928269000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", 
"StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", 
"RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", 
"InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867647.37887: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867646.9154687-34480-100247047461889/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867647.37890: _low_level_execute_command(): starting 30575 1726867647.37893: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867646.9154687-34480-100247047461889/ > /dev/null 2>&1 && sleep 0' 30575 1726867647.39100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867647.39327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867647.39336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867647.39408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867647.41295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867647.41383: stderr chunk (state=3): >>><<< 30575 1726867647.41494: stdout chunk (state=3): >>><<< 30575 1726867647.41513: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867647.41521: handler run complete 30575 1726867647.41583: attempt loop complete, returning result 30575 1726867647.41587: _execute() done 30575 1726867647.41589: dumping result to json 30575 1726867647.41608: done dumping result, returning 30575 1726867647.41653: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-e081-a588-0000000019cb] 30575 1726867647.41656: sending task result for task 0affcac9-a3a5-e081-a588-0000000019cb 30575 1726867647.42079: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019cb 30575 1726867647.42082: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867647.42337: no more pending results, returning what we have 30575 1726867647.42340: results queue empty 30575 1726867647.42341: checking for any_errors_fatal 30575 1726867647.42347: done checking for any_errors_fatal 30575 1726867647.42348: checking for max_fail_percentage 30575 1726867647.42349: done checking for max_fail_percentage 30575 1726867647.42350: checking to see if all hosts have failed and the running result is not ok 30575 1726867647.42351: done checking to see if all hosts have failed 30575 1726867647.42352: getting the remaining hosts for this loop 30575 1726867647.42353: done getting the remaining hosts for this 
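The `_execute_module` call above logs the exact `module_args` the role passed to `ansible.legacy.systemd`. A minimal sketch of the kind of task that would produce this invocation, reconstructed from the logged arguments (the role's actual source at `roles/network/tasks/main.yml` may differ in wording):

```yaml
# Sketch only -- argument values taken from the module_args in the log above.
- name: fedora.linux_system_roles.network : Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true   # explains the "censored" result body in the log
```

Note that `no_log: true` is why the `ok: [managed_node3]` result prints the `"censored"` placeholder instead of the full service facts.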
loop 30575 1726867647.42357: getting the next task for host managed_node3 30575 1726867647.42365: done getting next task for host managed_node3 30575 1726867647.42369: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867647.42374: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867647.42389: getting variables 30575 1726867647.42391: in VariableManager get_vars() 30575 1726867647.42455: Calling all_inventory to load vars for managed_node3 30575 1726867647.42458: Calling groups_inventory to load vars for managed_node3 30575 1726867647.42460: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867647.42470: Calling all_plugins_play to load vars for managed_node3 30575 1726867647.42472: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867647.42474: Calling groups_plugins_play to load vars for managed_node3 30575 1726867647.45255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867647.48693: done with get_vars() 30575 1726867647.48724: done getting variables 30575 1726867647.48886: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:27:27 -0400 (0:00:00.771) 0:01:22.866 ****** 30575 1726867647.48933: entering _queue_task() for managed_node3/service 30575 1726867647.49722: worker is 1 (out of 1 available) 30575 1726867647.49737: exiting _queue_task() for managed_node3/service 30575 1726867647.49751: done queuing things up, now waiting for results queue to drain 30575 1726867647.49752: waiting for pending results... 
30575 1726867647.50395: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867647.50439: in run() - task 0affcac9-a3a5-e081-a588-0000000019cc 30575 1726867647.50453: variable 'ansible_search_path' from source: unknown 30575 1726867647.50883: variable 'ansible_search_path' from source: unknown 30575 1726867647.50886: calling self._execute() 30575 1726867647.50889: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867647.50892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867647.50894: variable 'omit' from source: magic vars 30575 1726867647.51452: variable 'ansible_distribution_major_version' from source: facts 30575 1726867647.51695: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867647.51819: variable 'network_provider' from source: set_fact 30575 1726867647.52083: Evaluated conditional (network_provider == "nm"): True 30575 1726867647.52087: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867647.52179: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867647.52562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867647.63948: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867647.64019: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867647.64062: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867647.64098: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867647.64132: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867647.64218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867647.64256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867647.64287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867647.64332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867647.64355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867647.64404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867647.64435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867647.64483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867647.64515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867647.64534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867647.64682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867647.64685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867647.64687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867647.64690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867647.64696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867647.64836: variable 'network_connections' from source: include params 30575 1726867647.64851: variable 'interface' from source: play vars 30575 1726867647.64933: variable 'interface' from source: play vars 30575 1726867647.65011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867647.65175: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867647.65223: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867647.65258: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867647.65292: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867647.65343: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867647.65369: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867647.65400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867647.65436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867647.65476: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867647.65743: variable 'network_connections' from source: include params 30575 1726867647.65761: variable 'interface' from source: play vars 30575 1726867647.65870: variable 'interface' from source: play vars 30575 1726867647.65873: Evaluated conditional (__network_wpa_supplicant_required): False 30575 1726867647.65875: when evaluation is False, skipping this task 30575 1726867647.65879: _execute() done 30575 1726867647.65881: dumping result to json 30575 1726867647.65884: done dumping result, returning 30575 1726867647.65895: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-e081-a588-0000000019cc] 30575 
1726867647.65914: sending task result for task 0affcac9-a3a5-e081-a588-0000000019cc 30575 1726867647.66252: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019cc 30575 1726867647.66255: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30575 1726867647.66305: no more pending results, returning what we have 30575 1726867647.66311: results queue empty 30575 1726867647.66312: checking for any_errors_fatal 30575 1726867647.66330: done checking for any_errors_fatal 30575 1726867647.66331: checking for max_fail_percentage 30575 1726867647.66333: done checking for max_fail_percentage 30575 1726867647.66334: checking to see if all hosts have failed and the running result is not ok 30575 1726867647.66335: done checking to see if all hosts have failed 30575 1726867647.66336: getting the remaining hosts for this loop 30575 1726867647.66337: done getting the remaining hosts for this loop 30575 1726867647.66341: getting the next task for host managed_node3 30575 1726867647.66350: done getting next task for host managed_node3 30575 1726867647.66354: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867647.66359: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
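The skip above reports `"false_condition": "__network_wpa_supplicant_required"` after the conditionals `ansible_distribution_major_version != '6'` and `network_provider == "nm"` both evaluated True. A hedged sketch of a task shape consistent with those logged conditions (not the role's verbatim source):

```yaml
# Sketch only -- the when clauses mirror the conditionals evaluated in the log.
- name: fedora.linux_system_roles.network : Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required
```

With `when` lists, all conditions must hold; the first False short-circuits the task into the `skipping:` result seen here.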
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867647.66384: getting variables 30575 1726867647.66386: in VariableManager get_vars() 30575 1726867647.66432: Calling all_inventory to load vars for managed_node3 30575 1726867647.66434: Calling groups_inventory to load vars for managed_node3 30575 1726867647.66437: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867647.66446: Calling all_plugins_play to load vars for managed_node3 30575 1726867647.66449: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867647.66452: Calling groups_plugins_play to load vars for managed_node3 30575 1726867647.80946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867647.85735: done with get_vars() 30575 1726867647.85774: done getting variables 30575 1726867647.85832: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:27:27 -0400 (0:00:00.369) 0:01:23.236 
****** 30575 1726867647.85863: entering _queue_task() for managed_node3/service 30575 1726867647.87053: worker is 1 (out of 1 available) 30575 1726867647.87066: exiting _queue_task() for managed_node3/service 30575 1726867647.87482: done queuing things up, now waiting for results queue to drain 30575 1726867647.87484: waiting for pending results... 30575 1726867647.87900: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867647.87935: in run() - task 0affcac9-a3a5-e081-a588-0000000019cd 30575 1726867647.88002: variable 'ansible_search_path' from source: unknown 30575 1726867647.88008: variable 'ansible_search_path' from source: unknown 30575 1726867647.88040: calling self._execute() 30575 1726867647.88248: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867647.88255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867647.88385: variable 'omit' from source: magic vars 30575 1726867647.89187: variable 'ansible_distribution_major_version' from source: facts 30575 1726867647.89198: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867647.89438: variable 'network_provider' from source: set_fact 30575 1726867647.89443: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867647.89447: when evaluation is False, skipping this task 30575 1726867647.89451: _execute() done 30575 1726867647.89571: dumping result to json 30575 1726867647.89578: done dumping result, returning 30575 1726867647.89657: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-e081-a588-0000000019cd] 30575 1726867647.89660: sending task result for task 0affcac9-a3a5-e081-a588-0000000019cd 30575 1726867647.89737: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019cd 30575 1726867647.89740: WORKER PROCESS EXITING skipping: 
[managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867647.89805: no more pending results, returning what we have 30575 1726867647.89811: results queue empty 30575 1726867647.89812: checking for any_errors_fatal 30575 1726867647.89822: done checking for any_errors_fatal 30575 1726867647.89823: checking for max_fail_percentage 30575 1726867647.89825: done checking for max_fail_percentage 30575 1726867647.89825: checking to see if all hosts have failed and the running result is not ok 30575 1726867647.89826: done checking to see if all hosts have failed 30575 1726867647.89827: getting the remaining hosts for this loop 30575 1726867647.89829: done getting the remaining hosts for this loop 30575 1726867647.89832: getting the next task for host managed_node3 30575 1726867647.89840: done getting next task for host managed_node3 30575 1726867647.89844: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867647.89849: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867647.89875: getting variables 30575 1726867647.89879: in VariableManager get_vars() 30575 1726867647.89926: Calling all_inventory to load vars for managed_node3 30575 1726867647.89930: Calling groups_inventory to load vars for managed_node3 30575 1726867647.89933: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867647.89945: Calling all_plugins_play to load vars for managed_node3 30575 1726867647.89949: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867647.89952: Calling groups_plugins_play to load vars for managed_node3 30575 1726867647.93058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867647.96430: done with get_vars() 30575 1726867647.96458: done getting variables 30575 1726867647.96636: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:27:27 -0400 (0:00:00.108) 0:01:23.344 ****** 30575 1726867647.96675: entering _queue_task() for managed_node3/copy 30575 1726867647.97667: worker is 1 (out of 1 available) 30575 1726867647.97681: exiting _queue_task() for managed_node3/copy 30575 1726867647.97692: done queuing things up, now waiting for results queue to drain 30575 1726867647.97693: waiting for 
pending results... 30575 1726867647.98397: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867647.98785: in run() - task 0affcac9-a3a5-e081-a588-0000000019ce 30575 1726867647.98790: variable 'ansible_search_path' from source: unknown 30575 1726867647.98793: variable 'ansible_search_path' from source: unknown 30575 1726867647.98796: calling self._execute() 30575 1726867647.98854: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867647.98861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867647.98872: variable 'omit' from source: magic vars 30575 1726867647.99740: variable 'ansible_distribution_major_version' from source: facts 30575 1726867647.99752: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867648.00317: variable 'network_provider' from source: set_fact 30575 1726867648.00329: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867648.00337: when evaluation is False, skipping this task 30575 1726867648.00344: _execute() done 30575 1726867648.00352: dumping result to json 30575 1726867648.00359: done dumping result, returning 30575 1726867648.00372: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-e081-a588-0000000019ce] 30575 1726867648.00385: sending task result for task 0affcac9-a3a5-e081-a588-0000000019ce skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30575 1726867648.00546: no more pending results, returning what we have 30575 1726867648.00550: results queue empty 30575 1726867648.00551: checking for any_errors_fatal 30575 1726867648.00558: done checking for any_errors_fatal 30575 1726867648.00559: checking for 
max_fail_percentage 30575 1726867648.00560: done checking for max_fail_percentage 30575 1726867648.00561: checking to see if all hosts have failed and the running result is not ok 30575 1726867648.00563: done checking to see if all hosts have failed 30575 1726867648.00563: getting the remaining hosts for this loop 30575 1726867648.00565: done getting the remaining hosts for this loop 30575 1726867648.00569: getting the next task for host managed_node3 30575 1726867648.00581: done getting next task for host managed_node3 30575 1726867648.00585: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867648.00590: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
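The copy task above is skipped with `"false_condition": "network_provider == \"initscripts\""`, the same gate that skipped "Enable network service". A sketch of the pattern under an assumed destination path (the real path is not shown in this log):

```yaml
# Sketch only -- dest is a hypothetical placeholder, not taken from the log.
- name: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    dest: /etc/sysconfig/network   # assumed path for illustration
    content: "# Managed by the network system role\n"
  when: network_provider == "initscripts"
```

Because this run resolved `network_provider` to `nm` via `set_fact`, every initscripts-only task in the role short-circuits the same way.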
False 30575 1726867648.00622: getting variables 30575 1726867648.00625: in VariableManager get_vars() 30575 1726867648.00671: Calling all_inventory to load vars for managed_node3 30575 1726867648.00673: Calling groups_inventory to load vars for managed_node3 30575 1726867648.00676: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867648.00793: Calling all_plugins_play to load vars for managed_node3 30575 1726867648.00797: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867648.00800: Calling groups_plugins_play to load vars for managed_node3 30575 1726867648.02546: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019ce 30575 1726867648.03063: WORKER PROCESS EXITING 30575 1726867648.03865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867648.07260: done with get_vars() 30575 1726867648.07290: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:27:28 -0400 (0:00:00.108) 0:01:23.452 ****** 30575 1726867648.07496: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867648.08501: worker is 1 (out of 1 available) 30575 1726867648.08511: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867648.08522: done queuing things up, now waiting for results queue to drain 30575 1726867648.08523: waiting for pending results... 
30575 1726867648.09093: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867648.09209: in run() - task 0affcac9-a3a5-e081-a588-0000000019cf 30575 1726867648.09336: variable 'ansible_search_path' from source: unknown 30575 1726867648.09341: variable 'ansible_search_path' from source: unknown 30575 1726867648.09387: calling self._execute() 30575 1726867648.09602: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867648.09611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867648.09619: variable 'omit' from source: magic vars 30575 1726867648.10497: variable 'ansible_distribution_major_version' from source: facts 30575 1726867648.10510: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867648.10514: variable 'omit' from source: magic vars 30575 1726867648.10624: variable 'omit' from source: magic vars 30575 1726867648.11066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867648.16129: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867648.16316: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867648.16359: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867648.16512: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867648.16538: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867648.16704: variable 'network_provider' from source: set_fact 30575 1726867648.17014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867648.17043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867648.17067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867648.17226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867648.17240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867648.17343: variable 'omit' from source: magic vars 30575 1726867648.17557: variable 'omit' from source: magic vars 30575 1726867648.17775: variable 'network_connections' from source: include params 30575 1726867648.17789: variable 'interface' from source: play vars 30575 1726867648.18060: variable 'interface' from source: play vars 30575 1726867648.18220: variable 'omit' from source: magic vars 30575 1726867648.18228: variable '__lsr_ansible_managed' from source: task vars 30575 1726867648.18485: variable '__lsr_ansible_managed' from source: task vars 30575 1726867648.18805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30575 1726867648.19205: Loaded config def from plugin (lookup/template) 30575 1726867648.19212: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30575 1726867648.19355: File lookup term: get_ansible_managed.j2 30575 1726867648.19358: variable 
'ansible_search_path' from source: unknown 30575 1726867648.19361: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30575 1726867648.19479: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30575 1726867648.19483: variable 'ansible_search_path' from source: unknown 30575 1726867648.30771: variable 'ansible_managed' from source: unknown 30575 1726867648.30927: variable 'omit' from source: magic vars 30575 1726867648.31183: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867648.31189: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867648.31191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867648.31193: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30575 1726867648.31195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867648.31197: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867648.31203: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867648.31291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867648.31502: Set connection var ansible_pipelining to False 30575 1726867648.31505: Set connection var ansible_shell_type to sh 30575 1726867648.31511: Set connection var ansible_shell_executable to /bin/sh 30575 1726867648.31531: Set connection var ansible_timeout to 10 30575 1726867648.31534: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867648.31536: Set connection var ansible_connection to ssh 30575 1726867648.31612: variable 'ansible_shell_executable' from source: unknown 30575 1726867648.31615: variable 'ansible_connection' from source: unknown 30575 1726867648.31618: variable 'ansible_module_compression' from source: unknown 30575 1726867648.31620: variable 'ansible_shell_type' from source: unknown 30575 1726867648.31622: variable 'ansible_shell_executable' from source: unknown 30575 1726867648.31625: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867648.31627: variable 'ansible_pipelining' from source: unknown 30575 1726867648.31629: variable 'ansible_timeout' from source: unknown 30575 1726867648.31637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867648.32076: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867648.32091: variable 'omit' from 
source: magic vars 30575 1726867648.32094: starting attempt loop 30575 1726867648.32096: running the handler 30575 1726867648.32098: _low_level_execute_command(): starting 30575 1726867648.32100: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867648.33130: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867648.33173: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867648.33230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867648.33270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867648.33319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867648.35251: stdout chunk (state=3): >>>/root <<< 30575 1726867648.35255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867648.35263: stdout chunk (state=3): >>><<< 30575 1726867648.35266: stderr chunk (state=3): >>><<< 30575 1726867648.35288: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867648.35417: _low_level_execute_command(): starting 30575 1726867648.35422: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867648.3540287-34522-135884958864913 `" && echo ansible-tmp-1726867648.3540287-34522-135884958864913="` echo /root/.ansible/tmp/ansible-tmp-1726867648.3540287-34522-135884958864913 `" ) && sleep 0' 30575 1726867648.36288: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867648.36301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867648.36394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867648.36437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867648.36456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867648.36472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867648.36594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867648.38430: stdout chunk (state=3): >>>ansible-tmp-1726867648.3540287-34522-135884958864913=/root/.ansible/tmp/ansible-tmp-1726867648.3540287-34522-135884958864913 <<< 30575 1726867648.38616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867648.38619: stdout chunk (state=3): >>><<< 30575 1726867648.38621: stderr chunk (state=3): >>><<< 30575 1726867648.38641: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867648.3540287-34522-135884958864913=/root/.ansible/tmp/ansible-tmp-1726867648.3540287-34522-135884958864913 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867648.38930: variable 'ansible_module_compression' from source: unknown 30575 1726867648.38934: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30575 1726867648.38936: variable 'ansible_facts' from source: unknown 30575 1726867648.39282: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867648.3540287-34522-135884958864913/AnsiballZ_network_connections.py 30575 1726867648.39504: Sending initial data 30575 1726867648.39507: Sent initial data (168 bytes) 30575 1726867648.40268: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867648.40329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867648.40434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867648.40475: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867648.40501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867648.40562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867648.42116: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867648.42185: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867648.42236: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp3atezulu /root/.ansible/tmp/ansible-tmp-1726867648.3540287-34522-135884958864913/AnsiballZ_network_connections.py <<< 30575 1726867648.42259: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867648.3540287-34522-135884958864913/AnsiballZ_network_connections.py" <<< 30575 1726867648.42286: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp3atezulu" to remote "/root/.ansible/tmp/ansible-tmp-1726867648.3540287-34522-135884958864913/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867648.3540287-34522-135884958864913/AnsiballZ_network_connections.py" <<< 30575 1726867648.43604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867648.43607: stdout chunk (state=3): >>><<< 30575 1726867648.43609: stderr chunk (state=3): >>><<< 30575 1726867648.43612: done transferring module to remote 30575 1726867648.43614: _low_level_execute_command(): starting 30575 1726867648.43616: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867648.3540287-34522-135884958864913/ /root/.ansible/tmp/ansible-tmp-1726867648.3540287-34522-135884958864913/AnsiballZ_network_connections.py && sleep 0' 30575 1726867648.44402: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867648.44421: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 30575 1726867648.44519: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867648.44656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867648.44688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867648.44802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867648.46749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867648.46753: stdout chunk (state=3): >>><<< 30575 1726867648.46755: stderr chunk (state=3): >>><<< 30575 1726867648.46757: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867648.46759: _low_level_execute_command(): starting 30575 1726867648.46761: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867648.3540287-34522-135884958864913/AnsiballZ_network_connections.py && sleep 0' 30575 1726867648.47682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867648.47882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867648.48131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867648.48213: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 30575 1726867648.72808: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 907d8824-891a-4719-b02a-cbadb34e89d9 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30575 1726867648.74769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867648.74780: stdout chunk (state=3): >>><<< 30575 1726867648.74783: stderr chunk (state=3): >>><<< 30575 1726867648.74786: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 907d8824-891a-4719-b02a-cbadb34e89d9 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867648.74790: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867648.3540287-34522-135884958864913/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867648.74792: _low_level_execute_command(): starting 30575 
1726867648.74954: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867648.3540287-34522-135884958864913/ > /dev/null 2>&1 && sleep 0'
30575 1726867648.76297: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
30575 1726867648.76462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30575 1726867648.76465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30575 1726867648.78275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30575 1726867648.78306: stderr chunk (state=3): >>><<<
30575 1726867648.78310: stdout chunk (state=3): >>><<<
30575 1726867648.78330: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30575 1726867648.78334: handler run complete
30575 1726867648.78587: attempt loop complete, returning result
30575 1726867648.78589: _execute() done
30575 1726867648.78591: dumping result to json
30575 1726867648.78593: done dumping result, returning
30575 1726867648.78595: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-e081-a588-0000000019cf]
30575 1726867648.78596: sending task result for task 0affcac9-a3a5-e081-a588-0000000019cf
30575 1726867648.78680: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019cf
30575 1726867648.78683: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "name": "statebr",
                    "state": "up"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": false
}

STDERR:
[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 907d8824-891a-4719-b02a-cbadb34e89d9 skipped because already active

30575 1726867648.78773: no more pending results, returning what we have
30575 1726867648.78794: results queue empty
30575 1726867648.78795: checking for any_errors_fatal
30575 1726867648.78800: done checking for any_errors_fatal
30575 1726867648.78801: checking for max_fail_percentage
30575 1726867648.78803: done checking for max_fail_percentage
30575 1726867648.78804: checking to see if all hosts have failed and the running result is not ok
30575 1726867648.78804: done checking to see if all hosts have failed
30575 1726867648.78805: getting the remaining hosts for this loop
30575 1726867648.78807: done getting the remaining hosts for this loop
30575 1726867648.78810: getting the next task for host managed_node3
30575 1726867648.78817: done getting next task for host managed_node3
30575 1726867648.78825: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
30575 1726867648.78829: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867648.78850: getting variables
30575 1726867648.78852: in VariableManager get_vars()
30575 1726867648.78937: Calling all_inventory to load vars for managed_node3
30575 1726867648.78939: Calling groups_inventory to load vars for managed_node3
30575 1726867648.78942: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867648.78951: Calling all_plugins_play to load vars for managed_node3
30575 1726867648.78953: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867648.78956: Calling groups_plugins_play to load vars for managed_node3
30575 1726867648.81489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867648.83927: done with get_vars()
30575 1726867648.83948: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024 17:27:28 -0400 (0:00:00.765) 0:01:24.218 ******
30575 1726867648.84050: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state
30575 1726867648.84549: worker is 1 (out of 1 available)
30575 1726867648.84565: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state
30575 1726867648.84605: done queuing things up, now waiting for results queue to drain
30575 1726867648.84609: waiting for pending results...
30575 1726867648.84940: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state
30575 1726867648.84971: in run() - task 0affcac9-a3a5-e081-a588-0000000019d0
30575 1726867648.84987: variable 'ansible_search_path' from source: unknown
30575 1726867648.84991: variable 'ansible_search_path' from source: unknown
30575 1726867648.85031: calling self._execute()
30575 1726867648.85147: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867648.85151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867648.85162: variable 'omit' from source: magic vars
30575 1726867648.85565: variable 'ansible_distribution_major_version' from source: facts
30575 1726867648.85573: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867648.85689: variable 'network_state' from source: role '' defaults
30575 1726867648.85698: Evaluated conditional (network_state != {}): False
30575 1726867648.85701: when evaluation is False, skipping this task
30575 1726867648.85704: _execute() done
30575 1726867648.85709: dumping result to json
30575 1726867648.85711: done dumping result, returning
30575 1726867648.85718: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-e081-a588-0000000019d0]
30575 1726867648.85811: sending task result for task 0affcac9-a3a5-e081-a588-0000000019d0
30575 1726867648.85869: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019d0
30575 1726867648.85871: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30575 1726867648.85934: no more pending results, returning what we have
30575 1726867648.85938: results queue empty
30575 1726867648.85939: checking for any_errors_fatal
30575 1726867648.85946: done checking for any_errors_fatal
30575 1726867648.85947: checking for max_fail_percentage
30575 1726867648.85949: done checking for max_fail_percentage
30575 1726867648.85949: checking to see if all hosts have failed and the running result is not ok
30575 1726867648.85950: done checking to see if all hosts have failed
30575 1726867648.85951: getting the remaining hosts for this loop
30575 1726867648.85952: done getting the remaining hosts for this loop
30575 1726867648.85955: getting the next task for host managed_node3
30575 1726867648.85961: done getting next task for host managed_node3
30575 1726867648.85965: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
30575 1726867648.85969: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867648.86097: getting variables
30575 1726867648.86098: in VariableManager get_vars()
30575 1726867648.86131: Calling all_inventory to load vars for managed_node3
30575 1726867648.86133: Calling groups_inventory to load vars for managed_node3
30575 1726867648.86135: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867648.86143: Calling all_plugins_play to load vars for managed_node3
30575 1726867648.86146: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867648.86148: Calling groups_plugins_play to load vars for managed_node3
30575 1726867648.87770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867648.89993: done with get_vars()
30575 1726867648.90015: done getting variables
30575 1726867648.90073: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 17:27:28 -0400 (0:00:00.060) 0:01:24.278 ******
30575 1726867648.90132: entering _queue_task() for managed_node3/debug
30575 1726867648.90468: worker is 1 (out of 1 available)
30575 1726867648.90590: exiting _queue_task() for managed_node3/debug
30575 1726867648.90601: done queuing things up, now waiting for results queue to drain
30575 1726867648.90603: waiting for pending results...
30575 1726867648.90860: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
30575 1726867648.91185: in run() - task 0affcac9-a3a5-e081-a588-0000000019d1
30575 1726867648.91189: variable 'ansible_search_path' from source: unknown
30575 1726867648.91192: variable 'ansible_search_path' from source: unknown
30575 1726867648.91194: calling self._execute()
30575 1726867648.91197: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867648.91200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867648.91202: variable 'omit' from source: magic vars
30575 1726867648.91566: variable 'ansible_distribution_major_version' from source: facts
30575 1726867648.91579: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867648.91586: variable 'omit' from source: magic vars
30575 1726867648.91654: variable 'omit' from source: magic vars
30575 1726867648.91689: variable 'omit' from source: magic vars
30575 1726867648.91730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30575 1726867648.91769: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30575 1726867648.91788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30575 1726867648.91805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867648.91821: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867648.91850: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30575 1726867648.91854: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867648.91946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867648.91976: Set connection var ansible_pipelining to False
30575 1726867648.91981: Set connection var ansible_shell_type to sh
30575 1726867648.91984: Set connection var ansible_shell_executable to /bin/sh
30575 1726867648.91986: Set connection var ansible_timeout to 10
30575 1726867648.91992: Set connection var ansible_module_compression to ZIP_DEFLATED
30575 1726867648.92010: Set connection var ansible_connection to ssh
30575 1726867648.92025: variable 'ansible_shell_executable' from source: unknown
30575 1726867648.92028: variable 'ansible_connection' from source: unknown
30575 1726867648.92031: variable 'ansible_module_compression' from source: unknown
30575 1726867648.92033: variable 'ansible_shell_type' from source: unknown
30575 1726867648.92035: variable 'ansible_shell_executable' from source: unknown
30575 1726867648.92037: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867648.92041: variable 'ansible_pipelining' from source: unknown
30575 1726867648.92044: variable 'ansible_timeout' from source: unknown
30575 1726867648.92053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867648.92194: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30575 1726867648.92198: variable 'omit' from source: magic vars
30575 1726867648.92205: starting attempt loop
30575 1726867648.92208: running the handler
30575 1726867648.92343: variable '__network_connections_result' from source: set_fact
30575 1726867648.92409: handler run complete
30575 1726867648.92484: attempt loop complete, returning result
30575 1726867648.92487: _execute() done
30575 1726867648.92491: dumping result to json
30575 1726867648.92493: done dumping result, returning
30575 1726867648.92497: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-e081-a588-0000000019d1]
30575 1726867648.92500: sending task result for task 0affcac9-a3a5-e081-a588-0000000019d1
30575 1726867648.92565: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019d1
30575 1726867648.92567: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result.stderr_lines": [
        "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 907d8824-891a-4719-b02a-cbadb34e89d9 skipped because already active"
    ]
}
30575 1726867648.92678: no more pending results, returning what we have
30575 1726867648.92683: results queue empty
30575 1726867648.92685: checking for any_errors_fatal
30575 1726867648.92692: done checking for any_errors_fatal
30575 1726867648.92693: checking for max_fail_percentage
30575 1726867648.92696: done checking for max_fail_percentage
30575 1726867648.92698: checking to see if all hosts have failed and the running result is not ok
30575 1726867648.92698: done checking to see if all hosts have failed
30575 1726867648.92699: getting the remaining hosts for this loop
30575 1726867648.92700: done getting the remaining hosts for this loop
30575 1726867648.92704: getting the next task for host managed_node3
30575 1726867648.92715: done getting next task for host managed_node3
30575 1726867648.92719: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
30575 1726867648.92726: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867648.92741: getting variables
30575 1726867648.92743: in VariableManager get_vars()
30575 1726867648.92927: Calling all_inventory to load vars for managed_node3
30575 1726867648.92930: Calling groups_inventory to load vars for managed_node3
30575 1726867648.92932: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867648.92940: Calling all_plugins_play to load vars for managed_node3
30575 1726867648.92943: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867648.92946: Calling groups_plugins_play to load vars for managed_node3
30575 1726867648.94536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867648.96275: done with get_vars()
30575 1726867648.96298: done getting variables
30575 1726867648.96364: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 17:27:28 -0400 (0:00:00.062) 0:01:24.341 ******
30575 1726867648.96404: entering _queue_task() for managed_node3/debug
30575 1726867648.96824: worker is 1 (out of 1 available)
30575 1726867648.96837: exiting _queue_task() for managed_node3/debug
30575 1726867648.96854: done queuing things up, now waiting for results queue to drain
30575 1726867648.96856: waiting for pending results...
30575 1726867648.97610: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
30575 1726867648.98026: in run() - task 0affcac9-a3a5-e081-a588-0000000019d2
30575 1726867648.98031: variable 'ansible_search_path' from source: unknown
30575 1726867648.98034: variable 'ansible_search_path' from source: unknown
30575 1726867648.98091: calling self._execute()
30575 1726867648.98322: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867648.98326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867648.98364: variable 'omit' from source: magic vars
30575 1726867648.99141: variable 'ansible_distribution_major_version' from source: facts
30575 1726867648.99152: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867648.99159: variable 'omit' from source: magic vars
30575 1726867648.99359: variable 'omit' from source: magic vars
30575 1726867648.99399: variable 'omit' from source: magic vars
30575 1726867648.99438: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30575 1726867648.99472: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30575 1726867648.99502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30575 1726867648.99583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867648.99587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867648.99590: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30575 1726867648.99594: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867648.99597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867648.99679: Set connection var ansible_pipelining to False
30575 1726867648.99683: Set connection var ansible_shell_type to sh
30575 1726867648.99689: Set connection var ansible_shell_executable to /bin/sh
30575 1726867648.99694: Set connection var ansible_timeout to 10
30575 1726867648.99700: Set connection var ansible_module_compression to ZIP_DEFLATED
30575 1726867648.99716: Set connection var ansible_connection to ssh
30575 1726867648.99737: variable 'ansible_shell_executable' from source: unknown
30575 1726867648.99741: variable 'ansible_connection' from source: unknown
30575 1726867648.99745: variable 'ansible_module_compression' from source: unknown
30575 1726867648.99747: variable 'ansible_shell_type' from source: unknown
30575 1726867648.99749: variable 'ansible_shell_executable' from source: unknown
30575 1726867648.99752: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867648.99755: variable 'ansible_pipelining' from source: unknown
30575 1726867648.99757: variable 'ansible_timeout' from source: unknown
30575 1726867648.99760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867648.99938: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30575 1726867648.99943: variable 'omit' from source: magic vars
30575 1726867648.99946: starting attempt loop
30575 1726867648.99949: running the handler
30575 1726867649.00082: variable '__network_connections_result' from source: set_fact
30575 1726867649.00090: variable '__network_connections_result' from source: set_fact
30575 1726867649.00162: handler run complete
30575 1726867649.00232: attempt loop complete, returning result
30575 1726867649.00235: _execute() done
30575 1726867649.00238: dumping result to json
30575 1726867649.00241: done dumping result, returning
30575 1726867649.00243: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-e081-a588-0000000019d2]
30575 1726867649.00261: sending task result for task 0affcac9-a3a5-e081-a588-0000000019d2
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "statebr",
                        "state": "up"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": false,
        "failed": false,
        "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 907d8824-891a-4719-b02a-cbadb34e89d9 skipped because already active\n",
        "stderr_lines": [
            "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 907d8824-891a-4719-b02a-cbadb34e89d9 skipped because already active"
        ]
    }
}
30575 1726867649.00487: no more pending results, returning what we have
30575 1726867649.00490: results queue empty
30575 1726867649.00491: checking for any_errors_fatal
30575 1726867649.00500: done checking for any_errors_fatal
30575 1726867649.00501: checking for max_fail_percentage
30575 1726867649.00503: done checking for max_fail_percentage
30575 1726867649.00504: checking to see if all hosts have failed and the running result is not ok
30575 1726867649.00505: done checking to see if all hosts have failed
30575 1726867649.00505: getting the remaining hosts for this loop
30575 1726867649.00510: done getting the remaining hosts for this loop
30575 1726867649.00514: getting the next task for host managed_node3
30575 1726867649.00525: done getting next task for host managed_node3
30575 1726867649.00529: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30575 1726867649.00536: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867649.00549: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019d2
30575 1726867649.00552: WORKER PROCESS EXITING
30575 1726867649.00686: getting variables
30575 1726867649.00688: in VariableManager get_vars()
30575 1726867649.00726: Calling all_inventory to load vars for managed_node3
30575 1726867649.00728: Calling groups_inventory to load vars for managed_node3
30575 1726867649.00736: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867649.00746: Calling all_plugins_play to load vars for managed_node3
30575 1726867649.00749: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867649.00752: Calling groups_plugins_play to load vars for managed_node3
30575 1726867649.03303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867649.05354: done with get_vars()
30575 1726867649.05387: done getting variables
30575 1726867649.05450: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 17:27:29 -0400 (0:00:00.090) 0:01:24.432 ******
30575 1726867649.05490: entering _queue_task() for managed_node3/debug
30575 1726867649.05855: worker is 1 (out of 1 available)
30575 1726867649.05868: exiting _queue_task() for managed_node3/debug
30575 1726867649.05985: done queuing things up, now waiting for results queue to drain
30575 1726867649.05987: waiting for pending results...
30575 1726867649.06194: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30575 1726867649.06615: in run() - task 0affcac9-a3a5-e081-a588-0000000019d3
30575 1726867649.06620: variable 'ansible_search_path' from source: unknown
30575 1726867649.06623: variable 'ansible_search_path' from source: unknown
30575 1726867649.06626: calling self._execute()
30575 1726867649.06629: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867649.06631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867649.06634: variable 'omit' from source: magic vars
30575 1726867649.07171: variable 'ansible_distribution_major_version' from source: facts
30575 1726867649.07185: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867649.07512: variable 'network_state' from source: role '' defaults
30575 1726867649.07635: Evaluated conditional (network_state != {}): False
30575 1726867649.07639: when evaluation is False, skipping this task
30575 1726867649.07641: _execute() done
30575 1726867649.07645: dumping result to json
30575 1726867649.07647: done dumping result, returning
30575 1726867649.07660: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-e081-a588-0000000019d3]
30575 1726867649.07665: sending task result for task 0affcac9-a3a5-e081-a588-0000000019d3
30575 1726867649.07853: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019d3
30575 1726867649.07857: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "false_condition": "network_state != {}"
}
30575 1726867649.07910: no more pending results, returning what we have
30575 1726867649.07914: results queue empty
30575 1726867649.07915: checking for any_errors_fatal
30575 1726867649.07927: done checking for any_errors_fatal
30575 1726867649.07928: checking for max_fail_percentage
30575 1726867649.07930: done checking for max_fail_percentage
30575 1726867649.07931: checking to see if all hosts have failed and the running result is not ok
30575 1726867649.07932: done checking to see if all hosts have failed
30575 1726867649.07933: getting the remaining hosts for this loop
30575 1726867649.07935: done getting the remaining hosts for this loop
30575 1726867649.07939: getting the next task for host managed_node3
30575 1726867649.07948: done getting next task for host managed_node3
30575 1726867649.07952: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
30575 1726867649.07958: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867649.07987: getting variables
30575 1726867649.07989: in VariableManager get_vars()
30575 1726867649.08034: Calling all_inventory to load vars for managed_node3
30575 1726867649.08037: Calling groups_inventory to load vars for managed_node3
30575 1726867649.08040: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867649.08053: Calling all_plugins_play to load vars for managed_node3
30575 1726867649.08056: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867649.08059: Calling groups_plugins_play to load vars for managed_node3
30575 1726867649.11742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867649.16296: done with get_vars()
30575 1726867649.16324: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 17:27:29 -0400 (0:00:00.109) 0:01:24.541 ******
30575 1726867649.16425: entering _queue_task() for managed_node3/ping
30575 1726867649.17193: worker is 1 (out of 1 available)
30575 1726867649.17206: exiting _queue_task() for managed_node3/ping
30575 1726867649.17219: done queuing things up, now waiting for results queue to drain
30575 1726867649.17221: waiting for pending results...
30575 1726867649.17890: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867649.18114: in run() - task 0affcac9-a3a5-e081-a588-0000000019d4 30575 1726867649.18121: variable 'ansible_search_path' from source: unknown 30575 1726867649.18125: variable 'ansible_search_path' from source: unknown 30575 1726867649.18332: calling self._execute() 30575 1726867649.18410: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867649.18414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867649.18439: variable 'omit' from source: magic vars 30575 1726867649.19233: variable 'ansible_distribution_major_version' from source: facts 30575 1726867649.19245: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867649.19252: variable 'omit' from source: magic vars 30575 1726867649.19422: variable 'omit' from source: magic vars 30575 1726867649.19566: variable 'omit' from source: magic vars 30575 1726867649.19612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867649.19646: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867649.19667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867649.19814: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867649.19817: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867649.19835: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867649.19838: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867649.19841: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867649.19949: Set connection var ansible_pipelining to False 30575 1726867649.19952: Set connection var ansible_shell_type to sh 30575 1726867649.20243: Set connection var ansible_shell_executable to /bin/sh 30575 1726867649.20246: Set connection var ansible_timeout to 10 30575 1726867649.20249: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867649.20252: Set connection var ansible_connection to ssh 30575 1726867649.20254: variable 'ansible_shell_executable' from source: unknown 30575 1726867649.20256: variable 'ansible_connection' from source: unknown 30575 1726867649.20258: variable 'ansible_module_compression' from source: unknown 30575 1726867649.20260: variable 'ansible_shell_type' from source: unknown 30575 1726867649.20262: variable 'ansible_shell_executable' from source: unknown 30575 1726867649.20264: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867649.20266: variable 'ansible_pipelining' from source: unknown 30575 1726867649.20268: variable 'ansible_timeout' from source: unknown 30575 1726867649.20270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867649.20610: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867649.20614: variable 'omit' from source: magic vars 30575 1726867649.20617: starting attempt loop 30575 1726867649.20619: running the handler 30575 1726867649.20747: _low_level_execute_command(): starting 30575 1726867649.20755: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867649.22258: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867649.22262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867649.22264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867649.22267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867649.22270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867649.22271: stderr chunk (state=3): >>>debug2: match found <<< 30575 1726867649.22273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867649.22412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867649.22724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867649.22784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867649.24492: stdout chunk (state=3): >>>/root <<< 30575 1726867649.24550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867649.24666: stderr chunk (state=3): >>><<< 30575 1726867649.24669: stdout chunk (state=3): >>><<< 30575 1726867649.24760: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867649.24764: _low_level_execute_command(): starting 30575 1726867649.24767: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867649.2469523-34600-186332018360417 `" && echo ansible-tmp-1726867649.2469523-34600-186332018360417="` echo /root/.ansible/tmp/ansible-tmp-1726867649.2469523-34600-186332018360417 `" ) && sleep 0' 30575 1726867649.25781: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867649.25849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867649.25894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867649.26143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867649.26147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867649.26149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867649.26181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867649.28063: stdout chunk (state=3): >>>ansible-tmp-1726867649.2469523-34600-186332018360417=/root/.ansible/tmp/ansible-tmp-1726867649.2469523-34600-186332018360417 <<< 30575 1726867649.28271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867649.28275: stdout chunk (state=3): >>><<< 30575 1726867649.28279: stderr chunk (state=3): >>><<< 30575 1726867649.28282: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867649.2469523-34600-186332018360417=/root/.ansible/tmp/ansible-tmp-1726867649.2469523-34600-186332018360417 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867649.28284: variable 'ansible_module_compression' from source: unknown 30575 1726867649.28322: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30575 1726867649.28359: variable 'ansible_facts' from source: unknown 30575 1726867649.28647: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867649.2469523-34600-186332018360417/AnsiballZ_ping.py 30575 1726867649.29035: Sending initial data 30575 1726867649.29039: Sent initial data (153 bytes) 30575 1726867649.30046: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867649.30292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867649.30458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867649.30578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867649.32144: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867649.32156: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867649.32213: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpjt0wx44e /root/.ansible/tmp/ansible-tmp-1726867649.2469523-34600-186332018360417/AnsiballZ_ping.py <<< 30575 1726867649.32217: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867649.2469523-34600-186332018360417/AnsiballZ_ping.py" <<< 30575 1726867649.32384: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpjt0wx44e" to remote "/root/.ansible/tmp/ansible-tmp-1726867649.2469523-34600-186332018360417/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867649.2469523-34600-186332018360417/AnsiballZ_ping.py" <<< 30575 1726867649.34092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867649.34128: stderr chunk (state=3): >>><<< 30575 1726867649.34357: stdout chunk (state=3): >>><<< 30575 1726867649.34360: done transferring module to remote 30575 1726867649.34362: _low_level_execute_command(): starting 30575 1726867649.34365: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867649.2469523-34600-186332018360417/ /root/.ansible/tmp/ansible-tmp-1726867649.2469523-34600-186332018360417/AnsiballZ_ping.py && sleep 0' 30575 1726867649.35514: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867649.35591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867649.35767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867649.35791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867649.35846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867649.37673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867649.37687: stdout chunk (state=3): >>><<< 30575 1726867649.37699: stderr chunk (state=3): >>><<< 30575 1726867649.37723: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867649.37737: _low_level_execute_command(): starting 30575 1726867649.37747: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867649.2469523-34600-186332018360417/AnsiballZ_ping.py && sleep 0' 30575 1726867649.39157: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867649.39384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867649.39595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867649.39848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867649.54841: stdout chunk (state=3): >>> {"ping": "pong", "invocation": 
{"module_args": {"data": "pong"}}} <<< 30575 1726867649.56146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867649.56266: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. <<< 30575 1726867649.56294: stderr chunk (state=3): >>><<< 30575 1726867649.56297: stdout chunk (state=3): >>><<< 30575 1726867649.56320: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867649.56548: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867649.2469523-34600-186332018360417/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867649.56553: _low_level_execute_command(): starting 30575 1726867649.56555: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867649.2469523-34600-186332018360417/ > /dev/null 2>&1 && sleep 0' 30575 1726867649.57597: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867649.57615: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867649.57630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867649.57653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867649.57674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867649.57863: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867649.57888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867649.58160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867649.60053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867649.60080: stdout chunk (state=3): >>><<< 30575 1726867649.60084: stderr chunk (state=3): >>><<< 30575 1726867649.60169: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 30575 1726867649.60172: handler run complete 30575 1726867649.60174: attempt loop complete, returning result 30575 1726867649.60178: _execute() done 30575 1726867649.60180: dumping result to json 30575 1726867649.60183: done dumping result, returning 30575 1726867649.60185: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-e081-a588-0000000019d4] 30575 1726867649.60187: sending task result for task 0affcac9-a3a5-e081-a588-0000000019d4 ok: [managed_node3] => { "changed": false, "ping": "pong" } 30575 1726867649.60452: no more pending results, returning what we have 30575 1726867649.60457: results queue empty 30575 1726867649.60463: checking for any_errors_fatal 30575 1726867649.60473: done checking for any_errors_fatal 30575 1726867649.60474: checking for max_fail_percentage 30575 1726867649.60475: done checking for max_fail_percentage 30575 1726867649.60476: checking to see if all hosts have failed and the running result is not ok 30575 1726867649.60480: done checking to see if all hosts have failed 30575 1726867649.60480: getting the remaining hosts for this loop 30575 1726867649.60482: done getting the remaining hosts for this loop 30575 1726867649.60486: getting the next task for host managed_node3 30575 1726867649.60498: done getting next task for host managed_node3 30575 1726867649.60501: ^ task is: TASK: meta (role_complete) 30575 1726867649.60509: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867649.60523: getting variables 30575 1726867649.60525: in VariableManager get_vars() 30575 1726867649.60570: Calling all_inventory to load vars for managed_node3 30575 1726867649.60573: Calling groups_inventory to load vars for managed_node3 30575 1726867649.60576: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867649.61092: Calling all_plugins_play to load vars for managed_node3 30575 1726867649.61095: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867649.61099: Calling groups_plugins_play to load vars for managed_node3 30575 1726867649.61985: done sending task result for task 0affcac9-a3a5-e081-a588-0000000019d4 30575 1726867649.61988: WORKER PROCESS EXITING 30575 1726867649.64125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867649.69571: done with get_vars() 30575 1726867649.70002: done getting variables 30575 1726867649.70089: done queuing things up, now waiting for results queue to drain 30575 1726867649.70092: results queue empty 30575 1726867649.70093: checking for any_errors_fatal 30575 1726867649.70095: done checking for any_errors_fatal 30575 1726867649.70096: 
checking for max_fail_percentage 30575 1726867649.70097: done checking for max_fail_percentage 30575 1726867649.70098: checking to see if all hosts have failed and the running result is not ok 30575 1726867649.70099: done checking to see if all hosts have failed 30575 1726867649.70099: getting the remaining hosts for this loop 30575 1726867649.70100: done getting the remaining hosts for this loop 30575 1726867649.70103: getting the next task for host managed_node3 30575 1726867649.70109: done getting next task for host managed_node3 30575 1726867649.70112: ^ task is: TASK: Include network role 30575 1726867649.70114: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867649.70117: getting variables 30575 1726867649.70118: in VariableManager get_vars() 30575 1726867649.70130: Calling all_inventory to load vars for managed_node3 30575 1726867649.70132: Calling groups_inventory to load vars for managed_node3 30575 1726867649.70134: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867649.70140: Calling all_plugins_play to load vars for managed_node3 30575 1726867649.70142: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867649.70145: Calling groups_plugins_play to load vars for managed_node3 30575 1726867649.72986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867649.77635: done with get_vars() 30575 1726867649.77665: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml:3 Friday 20 September 2024 17:27:29 -0400 (0:00:00.613) 0:01:25.154 ****** 30575 1726867649.77746: entering _queue_task() for managed_node3/include_role 30575 1726867649.78772: worker is 1 (out of 1 available) 30575 1726867649.78786: exiting _queue_task() for managed_node3/include_role 30575 1726867649.78802: done queuing things up, now waiting for results queue to drain 30575 1726867649.78804: waiting for pending results... 
30575 1726867649.79385: running TaskExecutor() for managed_node3/TASK: Include network role 30575 1726867649.79723: in run() - task 0affcac9-a3a5-e081-a588-0000000017d9 30575 1726867649.79727: variable 'ansible_search_path' from source: unknown 30575 1726867649.79730: variable 'ansible_search_path' from source: unknown 30575 1726867649.79766: calling self._execute() 30575 1726867649.79974: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867649.79980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867649.80223: variable 'omit' from source: magic vars 30575 1726867649.81025: variable 'ansible_distribution_major_version' from source: facts 30575 1726867649.81029: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867649.81032: _execute() done 30575 1726867649.81035: dumping result to json 30575 1726867649.81037: done dumping result, returning 30575 1726867649.81040: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcac9-a3a5-e081-a588-0000000017d9] 30575 1726867649.81042: sending task result for task 0affcac9-a3a5-e081-a588-0000000017d9 30575 1726867649.81145: done sending task result for task 0affcac9-a3a5-e081-a588-0000000017d9 30575 1726867649.81149: WORKER PROCESS EXITING 30575 1726867649.81180: no more pending results, returning what we have 30575 1726867649.81187: in VariableManager get_vars() 30575 1726867649.81241: Calling all_inventory to load vars for managed_node3 30575 1726867649.81245: Calling groups_inventory to load vars for managed_node3 30575 1726867649.81249: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867649.81264: Calling all_plugins_play to load vars for managed_node3 30575 1726867649.81268: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867649.81271: Calling groups_plugins_play to load vars for managed_node3 30575 1726867649.85674: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867649.89703: done with get_vars() 30575 1726867649.89727: variable 'ansible_search_path' from source: unknown 30575 1726867649.89728: variable 'ansible_search_path' from source: unknown 30575 1726867649.90304: variable 'omit' from source: magic vars 30575 1726867649.90345: variable 'omit' from source: magic vars 30575 1726867649.90360: variable 'omit' from source: magic vars 30575 1726867649.90363: we have included files to process 30575 1726867649.90364: generating all_blocks data 30575 1726867649.90366: done generating all_blocks data 30575 1726867649.90369: processing included file: fedora.linux_system_roles.network 30575 1726867649.90493: in VariableManager get_vars() 30575 1726867649.90509: done with get_vars() 30575 1726867649.90545: in VariableManager get_vars() 30575 1726867649.90562: done with get_vars() 30575 1726867649.90652: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30575 1726867649.90975: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30575 1726867649.91178: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30575 1726867649.92226: in VariableManager get_vars() 30575 1726867649.92248: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867649.96445: iterating over new_blocks loaded from include file 30575 1726867649.96447: in VariableManager get_vars() 30575 1726867649.96495: done with get_vars() 30575 1726867649.96497: filtering new block on tags 30575 1726867649.97154: done filtering new block on tags 30575 1726867649.97158: in VariableManager get_vars() 30575 1726867649.97175: done with get_vars() 30575 1726867649.97178: filtering new block on tags 30575 1726867649.97287: done 
filtering new block on tags 30575 1726867649.97290: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30575 1726867649.97326: extending task lists for all hosts with included blocks 30575 1726867649.97502: done extending task lists 30575 1726867649.97503: done processing included files 30575 1726867649.97504: results queue empty 30575 1726867649.97505: checking for any_errors_fatal 30575 1726867649.97506: done checking for any_errors_fatal 30575 1726867649.97507: checking for max_fail_percentage 30575 1726867649.97508: done checking for max_fail_percentage 30575 1726867649.97509: checking to see if all hosts have failed and the running result is not ok 30575 1726867649.97510: done checking to see if all hosts have failed 30575 1726867649.97510: getting the remaining hosts for this loop 30575 1726867649.97511: done getting the remaining hosts for this loop 30575 1726867649.97629: getting the next task for host managed_node3 30575 1726867649.97635: done getting next task for host managed_node3 30575 1726867649.97637: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867649.97640: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867649.97657: getting variables 30575 1726867649.97659: in VariableManager get_vars() 30575 1726867649.97673: Calling all_inventory to load vars for managed_node3 30575 1726867649.97675: Calling groups_inventory to load vars for managed_node3 30575 1726867649.97679: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867649.97685: Calling all_plugins_play to load vars for managed_node3 30575 1726867649.97687: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867649.97689: Calling groups_plugins_play to load vars for managed_node3 30575 1726867650.00944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867650.05441: done with get_vars() 30575 1726867650.05463: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:27:30 -0400 (0:00:00.280) 0:01:25.435 ****** 30575 1726867650.05813: entering _queue_task() for managed_node3/include_tasks 30575 1726867650.07162: worker is 1 (out of 1 available) 30575 1726867650.07176: exiting _queue_task() for managed_node3/include_tasks 30575 1726867650.07190: done queuing things up, now waiting for results queue to drain 30575 1726867650.07192: waiting for pending results... 
30575 1726867650.07958: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867650.08383: in run() - task 0affcac9-a3a5-e081-a588-000000001b3b 30575 1726867650.08463: variable 'ansible_search_path' from source: unknown 30575 1726867650.08509: variable 'ansible_search_path' from source: unknown 30575 1726867650.08559: calling self._execute() 30575 1726867650.08874: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867650.08914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867650.09047: variable 'omit' from source: magic vars 30575 1726867650.10137: variable 'ansible_distribution_major_version' from source: facts 30575 1726867650.10140: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867650.10143: _execute() done 30575 1726867650.10145: dumping result to json 30575 1726867650.10148: done dumping result, returning 30575 1726867650.10150: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-e081-a588-000000001b3b] 30575 1726867650.10152: sending task result for task 0affcac9-a3a5-e081-a588-000000001b3b 30575 1726867650.10280: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b3b 30575 1726867650.10342: no more pending results, returning what we have 30575 1726867650.10348: in VariableManager get_vars() 30575 1726867650.10405: Calling all_inventory to load vars for managed_node3 30575 1726867650.10409: Calling groups_inventory to load vars for managed_node3 30575 1726867650.10412: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867650.10692: Calling all_plugins_play to load vars for managed_node3 30575 1726867650.10697: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867650.10701: Calling groups_plugins_play to load vars for managed_node3 30575 
1726867650.11221: WORKER PROCESS EXITING 30575 1726867650.14718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867650.19544: done with get_vars() 30575 1726867650.19588: variable 'ansible_search_path' from source: unknown 30575 1726867650.19589: variable 'ansible_search_path' from source: unknown 30575 1726867650.19629: we have included files to process 30575 1726867650.19631: generating all_blocks data 30575 1726867650.19633: done generating all_blocks data 30575 1726867650.19637: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867650.19792: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867650.19797: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867650.21313: done processing included file 30575 1726867650.21316: iterating over new_blocks loaded from include file 30575 1726867650.21318: in VariableManager get_vars() 30575 1726867650.21347: done with get_vars() 30575 1726867650.21350: filtering new block on tags 30575 1726867650.21534: done filtering new block on tags 30575 1726867650.21538: in VariableManager get_vars() 30575 1726867650.21563: done with get_vars() 30575 1726867650.21565: filtering new block on tags 30575 1726867650.21658: done filtering new block on tags 30575 1726867650.21662: in VariableManager get_vars() 30575 1726867650.21730: done with get_vars() 30575 1726867650.21732: filtering new block on tags 30575 1726867650.21896: done filtering new block on tags 30575 1726867650.21900: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30575 1726867650.21906: extending task lists for all hosts 
with included blocks 30575 1726867650.26537: done extending task lists 30575 1726867650.26539: done processing included files 30575 1726867650.26540: results queue empty 30575 1726867650.26540: checking for any_errors_fatal 30575 1726867650.26544: done checking for any_errors_fatal 30575 1726867650.26545: checking for max_fail_percentage 30575 1726867650.26546: done checking for max_fail_percentage 30575 1726867650.26547: checking to see if all hosts have failed and the running result is not ok 30575 1726867650.26548: done checking to see if all hosts have failed 30575 1726867650.26548: getting the remaining hosts for this loop 30575 1726867650.26550: done getting the remaining hosts for this loop 30575 1726867650.26552: getting the next task for host managed_node3 30575 1726867650.26558: done getting next task for host managed_node3 30575 1726867650.26561: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867650.26565: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867650.26581: getting variables 30575 1726867650.26582: in VariableManager get_vars() 30575 1726867650.26599: Calling all_inventory to load vars for managed_node3 30575 1726867650.26601: Calling groups_inventory to load vars for managed_node3 30575 1726867650.26603: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867650.26642: Calling all_plugins_play to load vars for managed_node3 30575 1726867650.26647: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867650.26651: Calling groups_plugins_play to load vars for managed_node3 30575 1726867650.30428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867650.34727: done with get_vars() 30575 1726867650.34757: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:27:30 -0400 (0:00:00.290) 0:01:25.726 ****** 30575 1726867650.34851: entering _queue_task() for managed_node3/setup 30575 1726867650.36040: worker is 1 (out of 1 available) 30575 1726867650.36055: exiting _queue_task() for managed_node3/setup 30575 1726867650.36070: done queuing things up, now waiting for results queue to drain 30575 1726867650.36072: waiting for pending results... 
30575 1726867650.37279: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867650.37558: in run() - task 0affcac9-a3a5-e081-a588-000000001b92 30575 1726867650.37783: variable 'ansible_search_path' from source: unknown 30575 1726867650.37787: variable 'ansible_search_path' from source: unknown 30575 1726867650.37791: calling self._execute() 30575 1726867650.37964: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867650.38038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867650.38054: variable 'omit' from source: magic vars 30575 1726867650.39218: variable 'ansible_distribution_major_version' from source: facts 30575 1726867650.39283: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867650.39880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867650.44855: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867650.44937: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867650.45284: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867650.45288: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867650.45385: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867650.45389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867650.45422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867650.45514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867650.45558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867650.45883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867650.45887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867650.45889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867650.45891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867650.45894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867650.45896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867650.46215: variable '__network_required_facts' from source: role 
'' defaults 30575 1726867650.46394: variable 'ansible_facts' from source: unknown 30575 1726867650.47983: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30575 1726867650.47987: when evaluation is False, skipping this task 30575 1726867650.47990: _execute() done 30575 1726867650.47992: dumping result to json 30575 1726867650.47995: done dumping result, returning 30575 1726867650.47997: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-e081-a588-000000001b92] 30575 1726867650.47999: sending task result for task 0affcac9-a3a5-e081-a588-000000001b92 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867650.48154: no more pending results, returning what we have 30575 1726867650.48160: results queue empty 30575 1726867650.48161: checking for any_errors_fatal 30575 1726867650.48163: done checking for any_errors_fatal 30575 1726867650.48163: checking for max_fail_percentage 30575 1726867650.48166: done checking for max_fail_percentage 30575 1726867650.48167: checking to see if all hosts have failed and the running result is not ok 30575 1726867650.48168: done checking to see if all hosts have failed 30575 1726867650.48169: getting the remaining hosts for this loop 30575 1726867650.48171: done getting the remaining hosts for this loop 30575 1726867650.48176: getting the next task for host managed_node3 30575 1726867650.48191: done getting next task for host managed_node3 30575 1726867650.48196: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867650.48203: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867650.48230: getting variables 30575 1726867650.48232: in VariableManager get_vars() 30575 1726867650.48402: Calling all_inventory to load vars for managed_node3 30575 1726867650.48405: Calling groups_inventory to load vars for managed_node3 30575 1726867650.48407: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867650.48419: Calling all_plugins_play to load vars for managed_node3 30575 1726867650.48423: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867650.48426: Calling groups_plugins_play to load vars for managed_node3 30575 1726867650.49658: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b92 30575 1726867650.49670: WORKER PROCESS EXITING 30575 1726867650.51627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867650.55659: done with get_vars() 30575 1726867650.55692: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:27:30 -0400 (0:00:00.211) 0:01:25.938 ****** 30575 1726867650.56049: entering _queue_task() for managed_node3/stat 30575 1726867650.56991: worker is 1 (out of 1 available) 30575 1726867650.57005: exiting _queue_task() for managed_node3/stat 30575 1726867650.57021: done queuing things up, now waiting for results queue to drain 30575 1726867650.57023: waiting for pending results... 
30575 1726867650.57628: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867650.57884: in run() - task 0affcac9-a3a5-e081-a588-000000001b94 30575 1726867650.57933: variable 'ansible_search_path' from source: unknown 30575 1726867650.57937: variable 'ansible_search_path' from source: unknown 30575 1726867650.57974: calling self._execute() 30575 1726867650.58215: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867650.58219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867650.58372: variable 'omit' from source: magic vars 30575 1726867650.59097: variable 'ansible_distribution_major_version' from source: facts 30575 1726867650.59232: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867650.59580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867650.60225: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867650.60269: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867650.60482: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867650.60485: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867650.60760: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867650.60787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867650.60813: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867650.61066: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867650.61070: variable '__network_is_ostree' from source: set_fact 30575 1726867650.61073: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867650.61179: when evaluation is False, skipping this task 30575 1726867650.61183: _execute() done 30575 1726867650.61186: dumping result to json 30575 1726867650.61190: done dumping result, returning 30575 1726867650.61200: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-e081-a588-000000001b94] 30575 1726867650.61205: sending task result for task 0affcac9-a3a5-e081-a588-000000001b94 30575 1726867650.61310: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b94 30575 1726867650.61314: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867650.61368: no more pending results, returning what we have 30575 1726867650.61373: results queue empty 30575 1726867650.61374: checking for any_errors_fatal 30575 1726867650.61392: done checking for any_errors_fatal 30575 1726867650.61394: checking for max_fail_percentage 30575 1726867650.61396: done checking for max_fail_percentage 30575 1726867650.61397: checking to see if all hosts have failed and the running result is not ok 30575 1726867650.61398: done checking to see if all hosts have failed 30575 1726867650.61399: getting the remaining hosts for this loop 30575 1726867650.61401: done getting the remaining hosts for this loop 30575 
1726867650.61405: getting the next task for host managed_node3 30575 1726867650.61415: done getting next task for host managed_node3 30575 1726867650.61420: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867650.61426: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867650.61459: getting variables 30575 1726867650.61462: in VariableManager get_vars() 30575 1726867650.61757: Calling all_inventory to load vars for managed_node3 30575 1726867650.61760: Calling groups_inventory to load vars for managed_node3 30575 1726867650.61763: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867650.61773: Calling all_plugins_play to load vars for managed_node3 30575 1726867650.61778: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867650.61782: Calling groups_plugins_play to load vars for managed_node3 30575 1726867650.65296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867650.68687: done with get_vars() 30575 1726867650.68714: done getting variables 30575 1726867650.68901: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:27:30 -0400 (0:00:00.128) 0:01:26.066 ****** 30575 1726867650.68945: entering _queue_task() for managed_node3/set_fact 30575 1726867650.70055: worker is 1 (out of 1 available) 30575 1726867650.70067: exiting _queue_task() for managed_node3/set_fact 30575 1726867650.70081: done queuing things up, now waiting for results queue to drain 30575 1726867650.70083: waiting for pending results... 
30575 1726867650.70525: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867650.70727: in run() - task 0affcac9-a3a5-e081-a588-000000001b95 30575 1726867650.70832: variable 'ansible_search_path' from source: unknown 30575 1726867650.70837: variable 'ansible_search_path' from source: unknown 30575 1726867650.71051: calling self._execute() 30575 1726867650.71089: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867650.71094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867650.71105: variable 'omit' from source: magic vars 30575 1726867650.72246: variable 'ansible_distribution_major_version' from source: facts 30575 1726867650.72258: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867650.72617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867650.73334: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867650.73417: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867650.73456: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867650.73590: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867650.73685: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867650.73712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867650.73986: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867650.73990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867650.74093: variable '__network_is_ostree' from source: set_fact 30575 1726867650.74097: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867650.74283: when evaluation is False, skipping this task 30575 1726867650.74286: _execute() done 30575 1726867650.74289: dumping result to json 30575 1726867650.74291: done dumping result, returning 30575 1726867650.74294: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-e081-a588-000000001b95] 30575 1726867650.74296: sending task result for task 0affcac9-a3a5-e081-a588-000000001b95 30575 1726867650.74363: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b95 30575 1726867650.74367: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867650.74432: no more pending results, returning what we have 30575 1726867650.74437: results queue empty 30575 1726867650.74438: checking for any_errors_fatal 30575 1726867650.74444: done checking for any_errors_fatal 30575 1726867650.74445: checking for max_fail_percentage 30575 1726867650.74447: done checking for max_fail_percentage 30575 1726867650.74448: checking to see if all hosts have failed and the running result is not ok 30575 1726867650.74449: done checking to see if all hosts have failed 30575 1726867650.74451: getting the remaining hosts for this loop 30575 1726867650.74453: done getting the remaining hosts for this loop 
30575 1726867650.74457: getting the next task for host managed_node3 30575 1726867650.74471: done getting next task for host managed_node3 30575 1726867650.74476: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867650.74484: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867650.74515: getting variables 30575 1726867650.74517: in VariableManager get_vars() 30575 1726867650.74561: Calling all_inventory to load vars for managed_node3 30575 1726867650.74564: Calling groups_inventory to load vars for managed_node3 30575 1726867650.74566: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867650.74576: Calling all_plugins_play to load vars for managed_node3 30575 1726867650.74917: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867650.74922: Calling groups_plugins_play to load vars for managed_node3 30575 1726867650.78233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867650.83671: done with get_vars() 30575 1726867650.83696: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:27:30 -0400 (0:00:00.149) 0:01:26.216 ****** 30575 1726867650.83913: entering _queue_task() for managed_node3/service_facts 30575 1726867650.84360: worker is 1 (out of 1 available) 30575 1726867650.84371: exiting _queue_task() for managed_node3/service_facts 30575 1726867650.84387: done queuing things up, now waiting for results queue to drain 30575 1726867650.84389: waiting for pending results... 
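The task queued here, `fedora.linux_system_roles.network : Check which services are running`, uses the `service_facts` module (seen in the `entering _queue_task() for managed_node3/service_facts` line). A minimal sketch of such a task, assuming only the name and module shown in the log (the real task sits at roles/network/tasks/set_facts.yml:21):

```yaml
# Sketch only; service_facts takes no required arguments and populates
# ansible_facts.services, as seen in the module's JSON output later in
# this log.
- name: Check which services are running
  service_facts:
```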
30575 1726867650.84828: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867650.84883: in run() - task 0affcac9-a3a5-e081-a588-000000001b97 30575 1726867650.84888: variable 'ansible_search_path' from source: unknown 30575 1726867650.84891: variable 'ansible_search_path' from source: unknown 30575 1726867650.84948: calling self._execute() 30575 1726867650.85085: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867650.85089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867650.85092: variable 'omit' from source: magic vars 30575 1726867650.85667: variable 'ansible_distribution_major_version' from source: facts 30575 1726867650.85684: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867650.85688: variable 'omit' from source: magic vars 30575 1726867650.85993: variable 'omit' from source: magic vars 30575 1726867650.86140: variable 'omit' from source: magic vars 30575 1726867650.86262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867650.86496: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867650.86514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867650.86532: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867650.86544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867650.86575: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867650.86580: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867650.86582: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867650.86684: Set connection var ansible_pipelining to False 30575 1726867650.87026: Set connection var ansible_shell_type to sh 30575 1726867650.87033: Set connection var ansible_shell_executable to /bin/sh 30575 1726867650.87039: Set connection var ansible_timeout to 10 30575 1726867650.87044: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867650.87052: Set connection var ansible_connection to ssh 30575 1726867650.87076: variable 'ansible_shell_executable' from source: unknown 30575 1726867650.87083: variable 'ansible_connection' from source: unknown 30575 1726867650.87087: variable 'ansible_module_compression' from source: unknown 30575 1726867650.87089: variable 'ansible_shell_type' from source: unknown 30575 1726867650.87091: variable 'ansible_shell_executable' from source: unknown 30575 1726867650.87094: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867650.87096: variable 'ansible_pipelining' from source: unknown 30575 1726867650.87098: variable 'ansible_timeout' from source: unknown 30575 1726867650.87100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867650.87628: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867650.87633: variable 'omit' from source: magic vars 30575 1726867650.87635: starting attempt loop 30575 1726867650.87638: running the handler 30575 1726867650.87640: _low_level_execute_command(): starting 30575 1726867650.87642: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867650.88438: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867650.88442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30575 1726867650.88444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867650.88447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867650.88689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867650.88693: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867650.88695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867650.88698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867650.90325: stdout chunk (state=3): >>>/root <<< 30575 1726867650.90434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867650.90484: stderr chunk (state=3): >>><<< 30575 1726867650.90488: stdout chunk (state=3): >>><<< 30575 1726867650.90586: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867650.90600: _low_level_execute_command(): starting 30575 1726867650.90608: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867650.905862-34657-220661825454798 `" && echo ansible-tmp-1726867650.905862-34657-220661825454798="` echo /root/.ansible/tmp/ansible-tmp-1726867650.905862-34657-220661825454798 `" ) && sleep 0' 30575 1726867650.91699: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 
1726867650.91871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867650.91945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867650.92049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867650.93942: stdout chunk (state=3): >>>ansible-tmp-1726867650.905862-34657-220661825454798=/root/.ansible/tmp/ansible-tmp-1726867650.905862-34657-220661825454798 <<< 30575 1726867650.94083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867650.94117: stdout chunk (state=3): >>><<< 30575 1726867650.94120: stderr chunk (state=3): >>><<< 30575 1726867650.94136: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867650.905862-34657-220661825454798=/root/.ansible/tmp/ansible-tmp-1726867650.905862-34657-220661825454798 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867650.94192: variable 'ansible_module_compression' from source: unknown 30575 1726867650.94282: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30575 1726867650.94293: variable 'ansible_facts' from source: unknown 30575 1726867650.94396: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867650.905862-34657-220661825454798/AnsiballZ_service_facts.py 30575 1726867650.94754: Sending initial data 30575 1726867650.94757: Sent initial data (161 bytes) 30575 1726867650.95394: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867650.95429: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867650.95449: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867650.95466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867650.95692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867650.97283: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867650.97382: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867650.97396: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpperr0d1h /root/.ansible/tmp/ansible-tmp-1726867650.905862-34657-220661825454798/AnsiballZ_service_facts.py <<< 30575 1726867650.97409: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867650.905862-34657-220661825454798/AnsiballZ_service_facts.py" <<< 30575 1726867650.97448: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpperr0d1h" to remote "/root/.ansible/tmp/ansible-tmp-1726867650.905862-34657-220661825454798/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867650.905862-34657-220661825454798/AnsiballZ_service_facts.py" <<< 30575 1726867650.99049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867650.99136: stderr chunk (state=3): >>><<< 30575 1726867650.99386: stdout chunk (state=3): >>><<< 30575 1726867650.99390: done transferring module to remote 30575 1726867650.99395: _low_level_execute_command(): starting 30575 1726867650.99401: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867650.905862-34657-220661825454798/ /root/.ansible/tmp/ansible-tmp-1726867650.905862-34657-220661825454798/AnsiballZ_service_facts.py && sleep 0' 30575 1726867651.00897: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867651.00921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867651.00946: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867651.00969: stderr chunk (state=3): >>>debug2: match not found <<< 30575 
1726867651.00995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867651.01095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867651.01264: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867651.01291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867651.01360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867651.01564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867651.03403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867651.03436: stderr chunk (state=3): >>><<< 30575 1726867651.03440: stdout chunk (state=3): >>><<< 30575 1726867651.03453: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867651.03456: _low_level_execute_command(): starting 30575 1726867651.03460: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867650.905862-34657-220661825454798/AnsiballZ_service_facts.py && sleep 0' 30575 1726867651.03986: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867651.03990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867651.03993: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867651.04010: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867651.04014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867651.04086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867651.04103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867651.04164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867652.55510: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": 
"sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": 
{"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30575 1726867652.56956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867652.56960: stdout chunk (state=3): >>><<< 30575 1726867652.56962: stderr chunk (state=3): >>><<< 30575 1726867652.57004: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": 
{"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": 
{"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", 
"state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": 
"sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867652.59109: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867650.905862-34657-220661825454798/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867652.59429: _low_level_execute_command(): starting 30575 1726867652.59437: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867650.905862-34657-220661825454798/ > /dev/null 2>&1 && sleep 0' 30575 1726867652.60471: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867652.60519: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867652.60536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867652.60544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867652.60583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867652.60593: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867652.60659: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 
1726867652.60663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867652.60665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867652.60736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867652.60756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867652.60820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867652.62685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867652.62689: stdout chunk (state=3): >>><<< 30575 1726867652.62691: stderr chunk (state=3): >>><<< 30575 1726867652.62694: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867652.62700: handler run complete 30575 1726867652.62908: variable 'ansible_facts' from source: unknown 30575 1726867652.63073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867652.64196: variable 'ansible_facts' from source: unknown 30575 1726867652.64751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867652.64942: attempt loop complete, returning result 30575 1726867652.64948: _execute() done 30575 1726867652.64951: dumping result to json 30575 1726867652.65259: done dumping result, returning 30575 1726867652.65268: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-e081-a588-000000001b97] 30575 1726867652.65273: sending task result for task 0affcac9-a3a5-e081-a588-000000001b97 30575 1726867652.66984: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b97 30575 1726867652.66988: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867652.67079: no more pending results, returning what we have 30575 1726867652.67082: results queue empty 30575 1726867652.67083: checking for any_errors_fatal 30575 1726867652.67087: done checking for any_errors_fatal 30575 1726867652.67088: checking for max_fail_percentage 30575 1726867652.67089: done checking for max_fail_percentage 30575 1726867652.67090: checking to see if all hosts have failed and the running result is not ok 30575 1726867652.67091: done checking to see if all hosts have failed 30575 
1726867652.67091: getting the remaining hosts for this loop 30575 1726867652.67093: done getting the remaining hosts for this loop 30575 1726867652.67096: getting the next task for host managed_node3 30575 1726867652.67103: done getting next task for host managed_node3 30575 1726867652.67107: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867652.67113: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867652.67126: getting variables 30575 1726867652.67127: in VariableManager get_vars() 30575 1726867652.67162: Calling all_inventory to load vars for managed_node3 30575 1726867652.67165: Calling groups_inventory to load vars for managed_node3 30575 1726867652.67167: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867652.67419: Calling all_plugins_play to load vars for managed_node3 30575 1726867652.67423: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867652.67427: Calling groups_plugins_play to load vars for managed_node3 30575 1726867652.70232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867652.78345: done with get_vars() 30575 1726867652.78371: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:27:32 -0400 (0:00:01.945) 0:01:28.162 ****** 30575 1726867652.78476: entering _queue_task() for managed_node3/package_facts 30575 1726867652.79105: worker is 1 (out of 1 available) 30575 1726867652.79119: exiting _queue_task() for managed_node3/package_facts 30575 1726867652.79131: done queuing things up, now waiting for results queue to drain 30575 1726867652.79133: waiting for pending results... 
30575 1726867652.79600: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867652.80008: in run() - task 0affcac9-a3a5-e081-a588-000000001b98 30575 1726867652.80254: variable 'ansible_search_path' from source: unknown 30575 1726867652.80259: variable 'ansible_search_path' from source: unknown 30575 1726867652.80264: calling self._execute() 30575 1726867652.80330: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867652.80349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867652.80403: variable 'omit' from source: magic vars 30575 1726867652.80843: variable 'ansible_distribution_major_version' from source: facts 30575 1726867652.80859: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867652.80869: variable 'omit' from source: magic vars 30575 1726867652.80958: variable 'omit' from source: magic vars 30575 1726867652.80996: variable 'omit' from source: magic vars 30575 1726867652.81046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867652.81089: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867652.81114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867652.81137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867652.81162: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867652.81199: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867652.81208: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867652.81217: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867652.81367: Set connection var ansible_pipelining to False 30575 1726867652.81371: Set connection var ansible_shell_type to sh 30575 1726867652.81374: Set connection var ansible_shell_executable to /bin/sh 30575 1726867652.81376: Set connection var ansible_timeout to 10 30575 1726867652.81381: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867652.81383: Set connection var ansible_connection to ssh 30575 1726867652.81404: variable 'ansible_shell_executable' from source: unknown 30575 1726867652.81414: variable 'ansible_connection' from source: unknown 30575 1726867652.81422: variable 'ansible_module_compression' from source: unknown 30575 1726867652.81429: variable 'ansible_shell_type' from source: unknown 30575 1726867652.81478: variable 'ansible_shell_executable' from source: unknown 30575 1726867652.81482: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867652.81484: variable 'ansible_pipelining' from source: unknown 30575 1726867652.81487: variable 'ansible_timeout' from source: unknown 30575 1726867652.81490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867652.81665: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867652.81691: variable 'omit' from source: magic vars 30575 1726867652.81705: starting attempt loop 30575 1726867652.81783: running the handler 30575 1726867652.81786: _low_level_execute_command(): starting 30575 1726867652.81788: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867652.82536: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867652.82649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867652.82666: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867652.82888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867652.82961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867652.84632: stdout chunk (state=3): >>>/root <<< 30575 1726867652.84968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867652.84972: stdout chunk (state=3): >>><<< 30575 1726867652.84974: stderr chunk (state=3): >>><<< 30575 1726867652.85280: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867652.85284: _low_level_execute_command(): starting 30575 1726867652.85287: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867652.8518655-34749-163214792878320 `" && echo ansible-tmp-1726867652.8518655-34749-163214792878320="` echo /root/.ansible/tmp/ansible-tmp-1726867652.8518655-34749-163214792878320 `" ) && sleep 0' 30575 1726867652.86631: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867652.86641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867652.86652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867652.86844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867652.86893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867652.86897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867652.86899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867652.87008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867652.88898: stdout chunk (state=3): >>>ansible-tmp-1726867652.8518655-34749-163214792878320=/root/.ansible/tmp/ansible-tmp-1726867652.8518655-34749-163214792878320 <<< 30575 1726867652.89040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867652.89044: stdout chunk (state=3): >>><<< 30575 1726867652.89052: stderr chunk (state=3): >>><<< 30575 1726867652.89073: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867652.8518655-34749-163214792878320=/root/.ansible/tmp/ansible-tmp-1726867652.8518655-34749-163214792878320 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867652.89129: variable 'ansible_module_compression' from source: unknown 30575 1726867652.89425: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30575 1726867652.89481: variable 'ansible_facts' from source: unknown 30575 1726867652.89798: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867652.8518655-34749-163214792878320/AnsiballZ_package_facts.py 30575 1726867652.90686: Sending initial data 30575 1726867652.90690: Sent initial data (162 bytes) 30575 1726867652.91457: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867652.91464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867652.91481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867652.91489: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867652.91499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867652.91515: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867652.91775: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867652.91782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867652.91785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867652.91885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867652.93423: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867652.93489: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867652.93584: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp49w1ocp0 /root/.ansible/tmp/ansible-tmp-1726867652.8518655-34749-163214792878320/AnsiballZ_package_facts.py <<< 30575 1726867652.93588: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867652.8518655-34749-163214792878320/AnsiballZ_package_facts.py" <<< 30575 1726867652.93646: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp49w1ocp0" to remote "/root/.ansible/tmp/ansible-tmp-1726867652.8518655-34749-163214792878320/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867652.8518655-34749-163214792878320/AnsiballZ_package_facts.py" <<< 30575 1726867652.97030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867652.97034: stdout chunk (state=3): >>><<< 30575 1726867652.97043: stderr chunk (state=3): >>><<< 30575 1726867652.97061: done transferring module to remote 30575 1726867652.97072: _low_level_execute_command(): starting 30575 1726867652.97082: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867652.8518655-34749-163214792878320/ /root/.ansible/tmp/ansible-tmp-1726867652.8518655-34749-163214792878320/AnsiballZ_package_facts.py && sleep 0' 30575 1726867652.98507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867652.98564: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867652.98633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867652.98681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867652.98697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867652.98721: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867652.98798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867653.00939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867653.00942: stdout chunk (state=3): >>><<< 30575 1726867653.00945: stderr chunk (state=3): >>><<< 30575 1726867653.00947: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867653.00950: _low_level_execute_command(): starting 30575 1726867653.00952: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867652.8518655-34749-163214792878320/AnsiballZ_package_facts.py && sleep 0' 30575 1726867653.01997: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867653.02034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867653.02057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867653.02288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867653.02304: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867653.02325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867653.02418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867653.46084: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": 
"nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30575 1726867653.46182: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": 
[{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": 
[{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": 
[{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": 
"2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2",
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": 
"libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": 
"1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", 
"version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", 
"version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", 
"version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": 
"libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10",
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", 
"version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30575 1726867653.48094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867653.48097: stdout chunk (state=3): >>><<< 30575 1726867653.48100: stderr chunk (state=3): >>><<< 30575 1726867653.48291: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": 
[{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", 
"version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": 
[{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", 
"version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", 
"version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": 
"tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": 
"python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", 
"release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": 
[{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": 
"perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", 
"release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": 
"python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", 
"version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867653.50235: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867652.8518655-34749-163214792878320/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867653.50260: _low_level_execute_command(): starting 30575 1726867653.50272: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867652.8518655-34749-163214792878320/ > /dev/null 2>&1 && sleep 0' 30575 1726867653.50899: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867653.50917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867653.50933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867653.50951: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867653.50967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867653.50986: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867653.51003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867653.51026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867653.51039: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867653.51094: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867653.51140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867653.51158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867653.51192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867653.51262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867653.53146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867653.53149: stdout chunk (state=3): >>><<< 30575 1726867653.53155: stderr chunk (state=3): >>><<< 30575 1726867653.53172: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867653.53180: handler run complete 30575 1726867653.54175: variable 'ansible_facts' from source: unknown 30575 1726867653.54814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867653.56800: variable 'ansible_facts' from source: unknown 30575 1726867653.57261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867653.57993: attempt loop complete, returning result 30575 1726867653.58008: _execute() done 30575 1726867653.58011: dumping result to json 30575 1726867653.58482: done dumping result, returning 30575 1726867653.58485: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-e081-a588-000000001b98] 30575 1726867653.58488: sending task result for task 0affcac9-a3a5-e081-a588-000000001b98 30575 1726867653.61699: done sending task result for task 
0affcac9-a3a5-e081-a588-000000001b98 30575 1726867653.61703: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867653.61886: no more pending results, returning what we have 30575 1726867653.61890: results queue empty 30575 1726867653.61890: checking for any_errors_fatal 30575 1726867653.61900: done checking for any_errors_fatal 30575 1726867653.61901: checking for max_fail_percentage 30575 1726867653.61903: done checking for max_fail_percentage 30575 1726867653.61903: checking to see if all hosts have failed and the running result is not ok 30575 1726867653.61907: done checking to see if all hosts have failed 30575 1726867653.61908: getting the remaining hosts for this loop 30575 1726867653.61909: done getting the remaining hosts for this loop 30575 1726867653.61913: getting the next task for host managed_node3 30575 1726867653.61922: done getting next task for host managed_node3 30575 1726867653.61926: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867653.61967: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867653.61986: getting variables 30575 1726867653.61988: in VariableManager get_vars() 30575 1726867653.62024: Calling all_inventory to load vars for managed_node3 30575 1726867653.62027: Calling groups_inventory to load vars for managed_node3 30575 1726867653.62030: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867653.62038: Calling all_plugins_play to load vars for managed_node3 30575 1726867653.62064: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867653.62072: Calling groups_plugins_play to load vars for managed_node3 30575 1726867653.63548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867653.65895: done with get_vars() 30575 1726867653.65938: done getting variables 30575 1726867653.66014: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:27:33 -0400 (0:00:00.875) 0:01:29.038 ****** 30575 1726867653.66056: entering _queue_task() for managed_node3/debug 30575 1726867653.66457: worker is 1 (out of 1 available) 30575 1726867653.66471: exiting _queue_task() for managed_node3/debug 30575 1726867653.66486: done queuing things up, now waiting for results queue to drain 
30575 1726867653.66488: waiting for pending results... 30575 1726867653.67017: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867653.67349: in run() - task 0affcac9-a3a5-e081-a588-000000001b3c 30575 1726867653.67365: variable 'ansible_search_path' from source: unknown 30575 1726867653.67369: variable 'ansible_search_path' from source: unknown 30575 1726867653.67439: calling self._execute() 30575 1726867653.67760: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867653.67767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867653.67770: variable 'omit' from source: magic vars 30575 1726867653.68606: variable 'ansible_distribution_major_version' from source: facts 30575 1726867653.68610: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867653.68613: variable 'omit' from source: magic vars 30575 1726867653.68628: variable 'omit' from source: magic vars 30575 1726867653.68764: variable 'network_provider' from source: set_fact 30575 1726867653.68787: variable 'omit' from source: magic vars 30575 1726867653.68829: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867653.68864: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867653.68911: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867653.68933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867653.68973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867653.69006: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867653.69018: variable 'ansible_host' 
from source: host vars for 'managed_node3' 30575 1726867653.69022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867653.69129: Set connection var ansible_pipelining to False 30575 1726867653.69132: Set connection var ansible_shell_type to sh 30575 1726867653.69138: Set connection var ansible_shell_executable to /bin/sh 30575 1726867653.69145: Set connection var ansible_timeout to 10 30575 1726867653.69148: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867653.69156: Set connection var ansible_connection to ssh 30575 1726867653.69184: variable 'ansible_shell_executable' from source: unknown 30575 1726867653.69187: variable 'ansible_connection' from source: unknown 30575 1726867653.69190: variable 'ansible_module_compression' from source: unknown 30575 1726867653.69193: variable 'ansible_shell_type' from source: unknown 30575 1726867653.69195: variable 'ansible_shell_executable' from source: unknown 30575 1726867653.69197: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867653.69202: variable 'ansible_pipelining' from source: unknown 30575 1726867653.69205: variable 'ansible_timeout' from source: unknown 30575 1726867653.69212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867653.69372: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867653.69376: variable 'omit' from source: magic vars 30575 1726867653.69380: starting attempt loop 30575 1726867653.69383: running the handler 30575 1726867653.69414: handler run complete 30575 1726867653.69457: attempt loop complete, returning result 30575 1726867653.69460: _execute() done 30575 1726867653.69463: dumping result to json 30575 
1726867653.69465: done dumping result, returning 30575 1726867653.69467: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-e081-a588-000000001b3c] 30575 1726867653.69470: sending task result for task 0affcac9-a3a5-e081-a588-000000001b3c ok: [managed_node3] => {} MSG: Using network provider: nm 30575 1726867653.69755: no more pending results, returning what we have 30575 1726867653.69759: results queue empty 30575 1726867653.69760: checking for any_errors_fatal 30575 1726867653.69770: done checking for any_errors_fatal 30575 1726867653.69771: checking for max_fail_percentage 30575 1726867653.69772: done checking for max_fail_percentage 30575 1726867653.69774: checking to see if all hosts have failed and the running result is not ok 30575 1726867653.69775: done checking to see if all hosts have failed 30575 1726867653.69775: getting the remaining hosts for this loop 30575 1726867653.69781: done getting the remaining hosts for this loop 30575 1726867653.69785: getting the next task for host managed_node3 30575 1726867653.69794: done getting next task for host managed_node3 30575 1726867653.69800: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867653.69805: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867653.69821: getting variables 30575 1726867653.69823: in VariableManager get_vars() 30575 1726867653.69872: Calling all_inventory to load vars for managed_node3 30575 1726867653.69874: Calling groups_inventory to load vars for managed_node3 30575 1726867653.69888: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867653.69895: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b3c 30575 1726867653.69897: WORKER PROCESS EXITING 30575 1726867653.69906: Calling all_plugins_play to load vars for managed_node3 30575 1726867653.69909: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867653.69912: Calling groups_plugins_play to load vars for managed_node3 30575 1726867653.71796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867653.73688: done with get_vars() 30575 1726867653.73722: done getting variables 30575 1726867653.73786: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:27:33 -0400 (0:00:00.077) 0:01:29.115 ****** 30575 1726867653.73837: entering _queue_task() for managed_node3/fail 30575 1726867653.74360: worker is 1 (out of 1 available) 30575 1726867653.74372: exiting _queue_task() for managed_node3/fail 30575 1726867653.74456: done queuing things up, now waiting for results queue to drain 30575 1726867653.74458: waiting for pending results... 30575 1726867653.74822: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867653.74842: in run() - task 0affcac9-a3a5-e081-a588-000000001b3d 30575 1726867653.74858: variable 'ansible_search_path' from source: unknown 30575 1726867653.74862: variable 'ansible_search_path' from source: unknown 30575 1726867653.74925: calling self._execute() 30575 1726867653.75027: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867653.75031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867653.75084: variable 'omit' from source: magic vars 30575 1726867653.75457: variable 'ansible_distribution_major_version' from source: facts 30575 1726867653.75468: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867653.75711: variable 'network_state' from source: role '' defaults 30575 1726867653.75714: Evaluated conditional (network_state != {}): False 30575 1726867653.75716: when evaluation is False, skipping this task 30575 1726867653.75717: _execute() done 30575 1726867653.75719: dumping result to json 30575 1726867653.75721: done dumping result, returning 30575 1726867653.75723: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-e081-a588-000000001b3d] 30575 1726867653.75725: sending task result for task 0affcac9-a3a5-e081-a588-000000001b3d skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867653.75839: no more pending results, returning what we have 30575 1726867653.75844: results queue empty 30575 1726867653.75845: checking for any_errors_fatal 30575 1726867653.75856: done checking for any_errors_fatal 30575 1726867653.75856: checking for max_fail_percentage 30575 1726867653.75859: done checking for max_fail_percentage 30575 1726867653.75860: checking to see if all hosts have failed and the running result is not ok 30575 1726867653.75861: done checking to see if all hosts have failed 30575 1726867653.75862: getting the remaining hosts for this loop 30575 1726867653.75863: done getting the remaining hosts for this loop 30575 1726867653.75867: getting the next task for host managed_node3 30575 1726867653.75879: done getting next task for host managed_node3 30575 1726867653.75884: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867653.75889: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867653.75927: getting variables 30575 1726867653.75929: in VariableManager get_vars() 30575 1726867653.75972: Calling all_inventory to load vars for managed_node3 30575 1726867653.75974: Calling groups_inventory to load vars for managed_node3 30575 1726867653.76183: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867653.76194: Calling all_plugins_play to load vars for managed_node3 30575 1726867653.76197: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867653.76201: Calling groups_plugins_play to load vars for managed_node3 30575 1726867653.76721: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b3d 30575 1726867653.76725: WORKER PROCESS EXITING 30575 1726867653.78724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867653.82003: done with get_vars() 30575 1726867653.82033: done getting variables 30575 1726867653.82214: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:27:33 -0400 (0:00:00.084) 0:01:29.200 ****** 30575 1726867653.82252: entering _queue_task() for managed_node3/fail 30575 1726867653.83167: worker is 1 (out of 1 available) 30575 1726867653.83263: exiting _queue_task() for managed_node3/fail 30575 1726867653.83276: done queuing things up, now waiting for results queue to drain 30575 1726867653.83280: waiting for pending results... 30575 1726867653.83747: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867653.84243: in run() - task 0affcac9-a3a5-e081-a588-000000001b3e 30575 1726867653.84247: variable 'ansible_search_path' from source: unknown 30575 1726867653.84250: variable 'ansible_search_path' from source: unknown 30575 1726867653.84254: calling self._execute() 30575 1726867653.84345: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867653.84592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867653.84602: variable 'omit' from source: magic vars 30575 1726867653.85263: variable 'ansible_distribution_major_version' from source: facts 30575 1726867653.85274: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867653.85583: variable 'network_state' from source: role '' defaults 30575 1726867653.85586: Evaluated conditional (network_state != {}): False 30575 1726867653.85589: when evaluation is False, skipping this task 30575 1726867653.85594: _execute() done 30575 1726867653.85596: dumping result to json 30575 1726867653.85602: done dumping result, returning 30575 1726867653.85613: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [0affcac9-a3a5-e081-a588-000000001b3e] 30575 1726867653.85618: sending task result for task 0affcac9-a3a5-e081-a588-000000001b3e 30575 1726867653.85723: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b3e 30575 1726867653.85728: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867653.85784: no more pending results, returning what we have 30575 1726867653.85789: results queue empty 30575 1726867653.85790: checking for any_errors_fatal 30575 1726867653.85799: done checking for any_errors_fatal 30575 1726867653.85800: checking for max_fail_percentage 30575 1726867653.85802: done checking for max_fail_percentage 30575 1726867653.85803: checking to see if all hosts have failed and the running result is not ok 30575 1726867653.85807: done checking to see if all hosts have failed 30575 1726867653.85808: getting the remaining hosts for this loop 30575 1726867653.85809: done getting the remaining hosts for this loop 30575 1726867653.85814: getting the next task for host managed_node3 30575 1726867653.85825: done getting next task for host managed_node3 30575 1726867653.85830: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867653.85836: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867653.85869: getting variables 30575 1726867653.85871: in VariableManager get_vars() 30575 1726867653.85924: Calling all_inventory to load vars for managed_node3 30575 1726867653.85927: Calling groups_inventory to load vars for managed_node3 30575 1726867653.85929: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867653.85942: Calling all_plugins_play to load vars for managed_node3 30575 1726867653.85946: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867653.85949: Calling groups_plugins_play to load vars for managed_node3 30575 1726867653.89080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867653.92311: done with get_vars() 30575 1726867653.92337: done getting variables 30575 1726867653.92618: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 
September 2024 17:27:33 -0400 (0:00:00.104) 0:01:29.304 ****** 30575 1726867653.92656: entering _queue_task() for managed_node3/fail 30575 1726867653.93326: worker is 1 (out of 1 available) 30575 1726867653.93339: exiting _queue_task() for managed_node3/fail 30575 1726867653.93352: done queuing things up, now waiting for results queue to drain 30575 1726867653.93353: waiting for pending results... 30575 1726867653.94103: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867653.94388: in run() - task 0affcac9-a3a5-e081-a588-000000001b3f 30575 1726867653.94392: variable 'ansible_search_path' from source: unknown 30575 1726867653.94395: variable 'ansible_search_path' from source: unknown 30575 1726867653.94508: calling self._execute() 30575 1726867653.94931: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867653.94945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867653.95147: variable 'omit' from source: magic vars 30575 1726867653.95956: variable 'ansible_distribution_major_version' from source: facts 30575 1726867653.95996: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867653.96350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867654.03567: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867654.03673: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867654.03764: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867654.03872: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 
1726867654.03993: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867654.04121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.04193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.04300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.04429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.04441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.04704: variable 'ansible_distribution_major_version' from source: facts 30575 1726867654.04814: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30575 1726867654.05010: variable 'ansible_distribution' from source: facts 30575 1726867654.05142: variable '__network_rh_distros' from source: role '' defaults 30575 1726867654.05145: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30575 1726867654.05712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.05748: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.05825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.06032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.06036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.06346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.06375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.06530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.06599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.06618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 
1726867654.06719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.06811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.06838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.06937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.06974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.07770: variable 'network_connections' from source: include params 30575 1726867654.07786: variable 'interface' from source: play vars 30575 1726867654.07858: variable 'interface' from source: play vars 30575 1726867654.08183: variable 'network_state' from source: role '' defaults 30575 1726867654.08186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867654.08451: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867654.08561: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867654.08598: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867654.08662: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867654.08721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867654.08754: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867654.08794: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.08828: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867654.08866: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30575 1726867654.08875: when evaluation is False, skipping this task 30575 1726867654.08885: _execute() done 30575 1726867654.08892: dumping result to json 30575 1726867654.08899: done dumping result, returning 30575 1726867654.08915: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-e081-a588-000000001b3f] 30575 1726867654.08925: sending task result for task 0affcac9-a3a5-e081-a588-000000001b3f skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or 
network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30575 1726867654.09224: no more pending results, returning what we have 30575 1726867654.09227: results queue empty 30575 1726867654.09228: checking for any_errors_fatal 30575 1726867654.09234: done checking for any_errors_fatal 30575 1726867654.09234: checking for max_fail_percentage 30575 1726867654.09236: done checking for max_fail_percentage 30575 1726867654.09237: checking to see if all hosts have failed and the running result is not ok 30575 1726867654.09239: done checking to see if all hosts have failed 30575 1726867654.09239: getting the remaining hosts for this loop 30575 1726867654.09241: done getting the remaining hosts for this loop 30575 1726867654.09244: getting the next task for host managed_node3 30575 1726867654.09252: done getting next task for host managed_node3 30575 1726867654.09257: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867654.09261: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867654.09291: getting variables 30575 1726867654.09293: in VariableManager get_vars() 30575 1726867654.09338: Calling all_inventory to load vars for managed_node3 30575 1726867654.09340: Calling groups_inventory to load vars for managed_node3 30575 1726867654.09343: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867654.09354: Calling all_plugins_play to load vars for managed_node3 30575 1726867654.09358: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867654.09362: Calling groups_plugins_play to load vars for managed_node3 30575 1726867654.10007: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b3f 30575 1726867654.10011: WORKER PROCESS EXITING 30575 1726867654.13524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867654.17784: done with get_vars() 30575 1726867654.17811: done getting variables 30575 1726867654.17871: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:27:34 -0400 (0:00:00.255) 0:01:29.560 ****** 30575 
1726867654.18255: entering _queue_task() for managed_node3/dnf 30575 1726867654.19014: worker is 1 (out of 1 available) 30575 1726867654.19026: exiting _queue_task() for managed_node3/dnf 30575 1726867654.19039: done queuing things up, now waiting for results queue to drain 30575 1726867654.19040: waiting for pending results... 30575 1726867654.19392: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867654.19810: in run() - task 0affcac9-a3a5-e081-a588-000000001b40 30575 1726867654.20066: variable 'ansible_search_path' from source: unknown 30575 1726867654.20070: variable 'ansible_search_path' from source: unknown 30575 1726867654.20074: calling self._execute() 30575 1726867654.20214: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867654.20225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867654.20239: variable 'omit' from source: magic vars 30575 1726867654.21091: variable 'ansible_distribution_major_version' from source: facts 30575 1726867654.21153: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867654.21596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867654.26692: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867654.26837: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867654.26976: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867654.27195: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867654.27199: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867654.27339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.27422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.27453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.27561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.27599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.27958: variable 'ansible_distribution' from source: facts 30575 1726867654.27961: variable 'ansible_distribution_major_version' from source: facts 30575 1726867654.27964: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30575 1726867654.28196: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867654.28444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.28524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.28636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.28682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.28734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.28983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.28986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.29009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.29060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.29263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.29267: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.29269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.29373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.29423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.29443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.29861: variable 'network_connections' from source: include params 30575 1726867654.29880: variable 'interface' from source: play vars 30575 1726867654.29959: variable 'interface' from source: play vars 30575 1726867654.30170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867654.30580: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867654.30627: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867654.30662: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867654.30884: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867654.30887: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867654.30890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867654.31021: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.31057: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867654.31197: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867654.31647: variable 'network_connections' from source: include params 30575 1726867654.31786: variable 'interface' from source: play vars 30575 1726867654.31982: variable 'interface' from source: play vars 30575 1726867654.31986: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867654.31989: when evaluation is False, skipping this task 30575 1726867654.31992: _execute() done 30575 1726867654.32081: dumping result to json 30575 1726867654.32084: done dumping result, returning 30575 1726867654.32087: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001b40] 30575 1726867654.32089: sending task result for task 0affcac9-a3a5-e081-a588-000000001b40 30575 1726867654.32174: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b40 30575 1726867654.32179: WORKER PROCESS EXITING skipping: 
[managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867654.32230: no more pending results, returning what we have 30575 1726867654.32233: results queue empty 30575 1726867654.32234: checking for any_errors_fatal 30575 1726867654.32243: done checking for any_errors_fatal 30575 1726867654.32244: checking for max_fail_percentage 30575 1726867654.32246: done checking for max_fail_percentage 30575 1726867654.32247: checking to see if all hosts have failed and the running result is not ok 30575 1726867654.32248: done checking to see if all hosts have failed 30575 1726867654.32249: getting the remaining hosts for this loop 30575 1726867654.32250: done getting the remaining hosts for this loop 30575 1726867654.32254: getting the next task for host managed_node3 30575 1726867654.32262: done getting next task for host managed_node3 30575 1726867654.32267: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867654.32272: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867654.32303: getting variables 30575 1726867654.32305: in VariableManager get_vars() 30575 1726867654.32346: Calling all_inventory to load vars for managed_node3 30575 1726867654.32348: Calling groups_inventory to load vars for managed_node3 30575 1726867654.32350: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867654.32360: Calling all_plugins_play to load vars for managed_node3 30575 1726867654.32362: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867654.32364: Calling groups_plugins_play to load vars for managed_node3 30575 1726867654.35246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867654.37252: done with get_vars() 30575 1726867654.37278: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867654.37365: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:27:34 -0400 (0:00:00.191) 0:01:29.751 ****** 30575 1726867654.37407: entering 
_queue_task() for managed_node3/yum 30575 1726867654.37982: worker is 1 (out of 1 available) 30575 1726867654.37993: exiting _queue_task() for managed_node3/yum 30575 1726867654.38003: done queuing things up, now waiting for results queue to drain 30575 1726867654.38007: waiting for pending results... 30575 1726867654.38197: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867654.38342: in run() - task 0affcac9-a3a5-e081-a588-000000001b41 30575 1726867654.38347: variable 'ansible_search_path' from source: unknown 30575 1726867654.38349: variable 'ansible_search_path' from source: unknown 30575 1726867654.38381: calling self._execute() 30575 1726867654.38493: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867654.38557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867654.38561: variable 'omit' from source: magic vars 30575 1726867654.38950: variable 'ansible_distribution_major_version' from source: facts 30575 1726867654.38966: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867654.39171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867654.42107: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867654.42189: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867654.42239: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867654.42342: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867654.42345: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867654.42404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.42460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.42494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.42539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.42567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.42669: variable 'ansible_distribution_major_version' from source: facts 30575 1726867654.42694: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30575 1726867654.42702: when evaluation is False, skipping this task 30575 1726867654.42779: _execute() done 30575 1726867654.42783: dumping result to json 30575 1726867654.42786: done dumping result, returning 30575 1726867654.42789: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001b41] 30575 1726867654.42791: sending task result for task 0affcac9-a3a5-e081-a588-000000001b41 30575 1726867654.42860: done sending 
task result for task 0affcac9-a3a5-e081-a588-000000001b41 30575 1726867654.42864: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30575 1726867654.42931: no more pending results, returning what we have 30575 1726867654.42935: results queue empty 30575 1726867654.42936: checking for any_errors_fatal 30575 1726867654.42944: done checking for any_errors_fatal 30575 1726867654.42945: checking for max_fail_percentage 30575 1726867654.42947: done checking for max_fail_percentage 30575 1726867654.42948: checking to see if all hosts have failed and the running result is not ok 30575 1726867654.42949: done checking to see if all hosts have failed 30575 1726867654.42950: getting the remaining hosts for this loop 30575 1726867654.42952: done getting the remaining hosts for this loop 30575 1726867654.42956: getting the next task for host managed_node3 30575 1726867654.42966: done getting next task for host managed_node3 30575 1726867654.42970: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867654.42975: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867654.43010: getting variables 30575 1726867654.43013: in VariableManager get_vars() 30575 1726867654.43064: Calling all_inventory to load vars for managed_node3 30575 1726867654.43067: Calling groups_inventory to load vars for managed_node3 30575 1726867654.43070: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867654.43284: Calling all_plugins_play to load vars for managed_node3 30575 1726867654.43288: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867654.43292: Calling groups_plugins_play to load vars for managed_node3 30575 1726867654.45658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867654.47191: done with get_vars() 30575 1726867654.47219: done getting variables 30575 1726867654.47282: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:27:34 -0400 (0:00:00.099) 0:01:29.850 ****** 30575 1726867654.47323: entering _queue_task() for managed_node3/fail 30575 1726867654.47773: worker is 1 (out of 1 available) 30575 1726867654.47787: exiting 
_queue_task() for managed_node3/fail 30575 1726867654.47798: done queuing things up, now waiting for results queue to drain 30575 1726867654.47800: waiting for pending results... 30575 1726867654.48086: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867654.48202: in run() - task 0affcac9-a3a5-e081-a588-000000001b42 30575 1726867654.48287: variable 'ansible_search_path' from source: unknown 30575 1726867654.48292: variable 'ansible_search_path' from source: unknown 30575 1726867654.48296: calling self._execute() 30575 1726867654.48370: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867654.48386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867654.48411: variable 'omit' from source: magic vars 30575 1726867654.48795: variable 'ansible_distribution_major_version' from source: facts 30575 1726867654.48812: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867654.48944: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867654.49134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867654.52163: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867654.52193: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867654.52239: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867654.52286: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867654.52316: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 
1726867654.52407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.52856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.52929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.52947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.52968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.53188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.53192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.53197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.53230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30575 1726867654.53244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.53338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.53510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.53530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.53737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.53740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.53938: variable 'network_connections' from source: include params 30575 1726867654.53941: variable 'interface' from source: play vars 30575 1726867654.53943: variable 'interface' from source: play vars 30575 1726867654.54051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867654.54192: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867654.54231: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867654.54258: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867654.54296: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867654.54339: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867654.54361: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867654.54587: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.54591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867654.54593: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867654.54724: variable 'network_connections' from source: include params 30575 1726867654.54727: variable 'interface' from source: play vars 30575 1726867654.54789: variable 'interface' from source: play vars 30575 1726867654.54827: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867654.54830: when evaluation is False, skipping this task 30575 1726867654.54833: _execute() done 30575 1726867654.54835: dumping result to json 30575 1726867654.54838: done dumping result, returning 30575 1726867654.54847: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001b42] 30575 
1726867654.54852: sending task result for task 0affcac9-a3a5-e081-a588-000000001b42 30575 1726867654.55183: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b42 30575 1726867654.55186: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867654.55230: no more pending results, returning what we have 30575 1726867654.55233: results queue empty 30575 1726867654.55234: checking for any_errors_fatal 30575 1726867654.55239: done checking for any_errors_fatal 30575 1726867654.55240: checking for max_fail_percentage 30575 1726867654.55241: done checking for max_fail_percentage 30575 1726867654.55242: checking to see if all hosts have failed and the running result is not ok 30575 1726867654.55243: done checking to see if all hosts have failed 30575 1726867654.55244: getting the remaining hosts for this loop 30575 1726867654.55245: done getting the remaining hosts for this loop 30575 1726867654.55248: getting the next task for host managed_node3 30575 1726867654.55256: done getting next task for host managed_node3 30575 1726867654.55260: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30575 1726867654.55264: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867654.55287: getting variables 30575 1726867654.55288: in VariableManager get_vars() 30575 1726867654.55332: Calling all_inventory to load vars for managed_node3 30575 1726867654.55335: Calling groups_inventory to load vars for managed_node3 30575 1726867654.55338: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867654.55352: Calling all_plugins_play to load vars for managed_node3 30575 1726867654.55355: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867654.55359: Calling groups_plugins_play to load vars for managed_node3 30575 1726867654.56934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867654.58664: done with get_vars() 30575 1726867654.58696: done getting variables 30575 1726867654.58764: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:27:34 -0400 (0:00:00.114) 0:01:29.965 ****** 
30575 1726867654.58811: entering _queue_task() for managed_node3/package 30575 1726867654.59264: worker is 1 (out of 1 available) 30575 1726867654.59276: exiting _queue_task() for managed_node3/package 30575 1726867654.59341: done queuing things up, now waiting for results queue to drain 30575 1726867654.59343: waiting for pending results... 30575 1726867654.59894: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30575 1726867654.59899: in run() - task 0affcac9-a3a5-e081-a588-000000001b43 30575 1726867654.59903: variable 'ansible_search_path' from source: unknown 30575 1726867654.59906: variable 'ansible_search_path' from source: unknown 30575 1726867654.60083: calling self._execute() 30575 1726867654.60088: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867654.60091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867654.60094: variable 'omit' from source: magic vars 30575 1726867654.60354: variable 'ansible_distribution_major_version' from source: facts 30575 1726867654.60365: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867654.60584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867654.60869: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867654.60925: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867654.60957: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867654.61031: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867654.61284: variable 'network_packages' from source: role '' defaults 30575 1726867654.61288: variable '__network_provider_setup' from source: role '' defaults 30575 
1726867654.61290: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867654.61333: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867654.61342: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867654.61398: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867654.61583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867654.65222: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867654.65292: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867654.65331: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867654.65368: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867654.65396: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867654.65487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.65583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.65587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.65589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.65599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.65644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.65667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.65701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.65745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.65758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.65994: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867654.66258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.66261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 30575 1726867654.66281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.66370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.66385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.66662: variable 'ansible_python' from source: facts 30575 1726867654.66683: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867654.66887: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867654.67039: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867654.67285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.67382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.67425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.67462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 
1726867654.67476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.67527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.67549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.67571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.67833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.67850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.68263: variable 'network_connections' from source: include params 30575 1726867654.68270: variable 'interface' from source: play vars 30575 1726867654.68483: variable 'interface' from source: play vars 30575 1726867654.68606: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867654.68635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 
1726867654.69083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.69086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867654.69088: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867654.69682: variable 'network_connections' from source: include params 30575 1726867654.69685: variable 'interface' from source: play vars 30575 1726867654.69688: variable 'interface' from source: play vars 30575 1726867654.69690: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867654.69771: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867654.70099: variable 'network_connections' from source: include params 30575 1726867654.70102: variable 'interface' from source: play vars 30575 1726867654.70159: variable 'interface' from source: play vars 30575 1726867654.70181: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867654.70264: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867654.70576: variable 'network_connections' from source: include params 30575 1726867654.70583: variable 'interface' from source: play vars 30575 1726867654.70648: variable 'interface' from source: play vars 30575 1726867654.70836: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867654.70839: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867654.70841: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867654.71006: variable '__network_packages_default_initscripts' from source: 
role '' defaults 30575 1726867654.71522: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867654.72264: variable 'network_connections' from source: include params 30575 1726867654.72267: variable 'interface' from source: play vars 30575 1726867654.72330: variable 'interface' from source: play vars 30575 1726867654.72338: variable 'ansible_distribution' from source: facts 30575 1726867654.72341: variable '__network_rh_distros' from source: role '' defaults 30575 1726867654.72348: variable 'ansible_distribution_major_version' from source: facts 30575 1726867654.72367: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867654.72670: variable 'ansible_distribution' from source: facts 30575 1726867654.72673: variable '__network_rh_distros' from source: role '' defaults 30575 1726867654.72675: variable 'ansible_distribution_major_version' from source: facts 30575 1726867654.72680: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867654.72882: variable 'ansible_distribution' from source: facts 30575 1726867654.72885: variable '__network_rh_distros' from source: role '' defaults 30575 1726867654.72888: variable 'ansible_distribution_major_version' from source: facts 30575 1726867654.72890: variable 'network_provider' from source: set_fact 30575 1726867654.72892: variable 'ansible_facts' from source: unknown 30575 1726867654.74082: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30575 1726867654.74085: when evaluation is False, skipping this task 30575 1726867654.74087: _execute() done 30575 1726867654.74089: dumping result to json 30575 1726867654.74091: done dumping result, returning 30575 1726867654.74093: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-e081-a588-000000001b43] 30575 1726867654.74094: 
sending task result for task 0affcac9-a3a5-e081-a588-000000001b43 30575 1726867654.74162: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b43 30575 1726867654.74165: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30575 1726867654.74222: no more pending results, returning what we have 30575 1726867654.74226: results queue empty 30575 1726867654.74227: checking for any_errors_fatal 30575 1726867654.74233: done checking for any_errors_fatal 30575 1726867654.74234: checking for max_fail_percentage 30575 1726867654.74236: done checking for max_fail_percentage 30575 1726867654.74237: checking to see if all hosts have failed and the running result is not ok 30575 1726867654.74238: done checking to see if all hosts have failed 30575 1726867654.74238: getting the remaining hosts for this loop 30575 1726867654.74245: done getting the remaining hosts for this loop 30575 1726867654.74250: getting the next task for host managed_node3 30575 1726867654.74257: done getting next task for host managed_node3 30575 1726867654.74262: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867654.74268: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867654.74297: getting variables 30575 1726867654.74299: in VariableManager get_vars() 30575 1726867654.74483: Calling all_inventory to load vars for managed_node3 30575 1726867654.74487: Calling groups_inventory to load vars for managed_node3 30575 1726867654.74491: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867654.74501: Calling all_plugins_play to load vars for managed_node3 30575 1726867654.74504: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867654.74510: Calling groups_plugins_play to load vars for managed_node3 30575 1726867654.76027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867654.77674: done with get_vars() 30575 1726867654.77699: done getting variables 30575 1726867654.78010: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:27:34 
-0400 (0:00:00.192) 0:01:30.157 ****** 30575 1726867654.78043: entering _queue_task() for managed_node3/package 30575 1726867654.78648: worker is 1 (out of 1 available) 30575 1726867654.78662: exiting _queue_task() for managed_node3/package 30575 1726867654.78675: done queuing things up, now waiting for results queue to drain 30575 1726867654.78676: waiting for pending results... 30575 1726867654.78920: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867654.79057: in run() - task 0affcac9-a3a5-e081-a588-000000001b44 30575 1726867654.79075: variable 'ansible_search_path' from source: unknown 30575 1726867654.79080: variable 'ansible_search_path' from source: unknown 30575 1726867654.79119: calling self._execute() 30575 1726867654.79225: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867654.79231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867654.79250: variable 'omit' from source: magic vars 30575 1726867654.79661: variable 'ansible_distribution_major_version' from source: facts 30575 1726867654.79672: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867654.79943: variable 'network_state' from source: role '' defaults 30575 1726867654.79946: Evaluated conditional (network_state != {}): False 30575 1726867654.79948: when evaluation is False, skipping this task 30575 1726867654.79950: _execute() done 30575 1726867654.79952: dumping result to json 30575 1726867654.79953: done dumping result, returning 30575 1726867654.79956: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000001b44] 30575 1726867654.79958: sending task result for task 0affcac9-a3a5-e081-a588-000000001b44 30575 1726867654.80022: done sending task 
result for task 0affcac9-a3a5-e081-a588-000000001b44 30575 1726867654.80025: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867654.80083: no more pending results, returning what we have 30575 1726867654.80087: results queue empty 30575 1726867654.80088: checking for any_errors_fatal 30575 1726867654.80096: done checking for any_errors_fatal 30575 1726867654.80096: checking for max_fail_percentage 30575 1726867654.80099: done checking for max_fail_percentage 30575 1726867654.80099: checking to see if all hosts have failed and the running result is not ok 30575 1726867654.80100: done checking to see if all hosts have failed 30575 1726867654.80101: getting the remaining hosts for this loop 30575 1726867654.80102: done getting the remaining hosts for this loop 30575 1726867654.80108: getting the next task for host managed_node3 30575 1726867654.80116: done getting next task for host managed_node3 30575 1726867654.80121: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867654.80126: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867654.80191: getting variables 30575 1726867654.80194: in VariableManager get_vars() 30575 1726867654.80235: Calling all_inventory to load vars for managed_node3 30575 1726867654.80238: Calling groups_inventory to load vars for managed_node3 30575 1726867654.80241: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867654.80251: Calling all_plugins_play to load vars for managed_node3 30575 1726867654.80370: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867654.80374: Calling groups_plugins_play to load vars for managed_node3 30575 1726867654.81930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867654.83599: done with get_vars() 30575 1726867654.83622: done getting variables 30575 1726867654.83687: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:27:34 -0400 (0:00:00.056) 0:01:30.214 ****** 30575 1726867654.83724: entering _queue_task() for managed_node3/package 30575 1726867654.84030: worker is 1 (out of 1 available) 30575 1726867654.84042: exiting _queue_task() for managed_node3/package 30575 1726867654.84056: 
done queuing things up, now waiting for results queue to drain 30575 1726867654.84058: waiting for pending results... 30575 1726867654.84380: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867654.84585: in run() - task 0affcac9-a3a5-e081-a588-000000001b45 30575 1726867654.84590: variable 'ansible_search_path' from source: unknown 30575 1726867654.84593: variable 'ansible_search_path' from source: unknown 30575 1726867654.84596: calling self._execute() 30575 1726867654.84655: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867654.84659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867654.84882: variable 'omit' from source: magic vars 30575 1726867654.85051: variable 'ansible_distribution_major_version' from source: facts 30575 1726867654.85063: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867654.85191: variable 'network_state' from source: role '' defaults 30575 1726867654.85382: Evaluated conditional (network_state != {}): False 30575 1726867654.85385: when evaluation is False, skipping this task 30575 1726867654.85387: _execute() done 30575 1726867654.85389: dumping result to json 30575 1726867654.85391: done dumping result, returning 30575 1726867654.85395: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000001b45] 30575 1726867654.85397: sending task result for task 0affcac9-a3a5-e081-a588-000000001b45 30575 1726867654.85456: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b45 30575 1726867654.85458: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867654.85515: no more pending 
results, returning what we have 30575 1726867654.85519: results queue empty 30575 1726867654.85519: checking for any_errors_fatal 30575 1726867654.85525: done checking for any_errors_fatal 30575 1726867654.85526: checking for max_fail_percentage 30575 1726867654.85528: done checking for max_fail_percentage 30575 1726867654.85528: checking to see if all hosts have failed and the running result is not ok 30575 1726867654.85529: done checking to see if all hosts have failed 30575 1726867654.85530: getting the remaining hosts for this loop 30575 1726867654.85531: done getting the remaining hosts for this loop 30575 1726867654.85534: getting the next task for host managed_node3 30575 1726867654.85541: done getting next task for host managed_node3 30575 1726867654.85545: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867654.85550: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30575 1726867654.85571: getting variables 30575 1726867654.85572: in VariableManager get_vars() 30575 1726867654.85608: Calling all_inventory to load vars for managed_node3 30575 1726867654.85610: Calling groups_inventory to load vars for managed_node3 30575 1726867654.85612: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867654.85619: Calling all_plugins_play to load vars for managed_node3 30575 1726867654.85621: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867654.85624: Calling groups_plugins_play to load vars for managed_node3 30575 1726867654.87512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867654.89265: done with get_vars() 30575 1726867654.89295: done getting variables 30575 1726867654.89408: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:27:34 -0400 (0:00:00.057) 0:01:30.271 ****** 30575 1726867654.89446: entering _queue_task() for managed_node3/service 30575 1726867654.89761: worker is 1 (out of 1 available) 30575 1726867654.89774: exiting _queue_task() for managed_node3/service 30575 1726867654.89790: done queuing things up, now waiting for results queue to drain 30575 1726867654.89792: waiting for pending results... 
30575 1726867654.90099: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867654.90251: in run() - task 0affcac9-a3a5-e081-a588-000000001b46 30575 1726867654.90274: variable 'ansible_search_path' from source: unknown 30575 1726867654.90285: variable 'ansible_search_path' from source: unknown 30575 1726867654.90330: calling self._execute() 30575 1726867654.90431: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867654.90446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867654.90461: variable 'omit' from source: magic vars 30575 1726867654.90850: variable 'ansible_distribution_major_version' from source: facts 30575 1726867654.90866: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867654.90990: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867654.91518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867654.93731: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867654.93802: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867654.93843: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867654.93885: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867654.93921: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867654.94001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30575 1726867654.94054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.94089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.94137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.94158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.94209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.94241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.94269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.94314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.94335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.94383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867654.94411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867654.94447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.94552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867654.94556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867654.94694: variable 'network_connections' from source: include params 30575 1726867654.94710: variable 'interface' from source: play vars 30575 1726867654.94780: variable 'interface' from source: play vars 30575 1726867654.94853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867654.95023: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867654.95064: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867654.95105: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867654.95141: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867654.95190: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867654.95309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867654.95312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867654.95314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867654.95330: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867654.95562: variable 'network_connections' from source: include params 30575 1726867654.95572: variable 'interface' from source: play vars 30575 1726867654.95640: variable 'interface' from source: play vars 30575 1726867654.95669: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867654.95679: when evaluation is False, skipping this task 30575 1726867654.95687: _execute() done 30575 1726867654.95694: dumping result to json 30575 1726867654.95702: done dumping result, returning 30575 1726867654.95713: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001b46] 30575 1726867654.95722: sending task result for task 0affcac9-a3a5-e081-a588-000000001b46 skipping: [managed_node3] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867654.95902: no more pending results, returning what we have 30575 1726867654.95905: results queue empty 30575 1726867654.95906: checking for any_errors_fatal 30575 1726867654.95913: done checking for any_errors_fatal 30575 1726867654.95913: checking for max_fail_percentage 30575 1726867654.95915: done checking for max_fail_percentage 30575 1726867654.95916: checking to see if all hosts have failed and the running result is not ok 30575 1726867654.95918: done checking to see if all hosts have failed 30575 1726867654.95919: getting the remaining hosts for this loop 30575 1726867654.95920: done getting the remaining hosts for this loop 30575 1726867654.95924: getting the next task for host managed_node3 30575 1726867654.95934: done getting next task for host managed_node3 30575 1726867654.95938: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867654.95943: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867654.95971: getting variables 30575 1726867654.95973: in VariableManager get_vars() 30575 1726867654.96018: Calling all_inventory to load vars for managed_node3 30575 1726867654.96021: Calling groups_inventory to load vars for managed_node3 30575 1726867654.96023: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867654.96035: Calling all_plugins_play to load vars for managed_node3 30575 1726867654.96038: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867654.96041: Calling groups_plugins_play to load vars for managed_node3 30575 1726867654.96691: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b46 30575 1726867654.96694: WORKER PROCESS EXITING 30575 1726867654.97686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867654.99835: done with get_vars() 30575 1726867654.99857: done getting variables 30575 1726867654.99920: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:27:34 -0400 (0:00:00.105) 0:01:30.377 ****** 30575 1726867654.99955: entering _queue_task() for managed_node3/service 30575 1726867655.00269: worker is 1 (out of 1 available) 30575 1726867655.00286: exiting _queue_task() for managed_node3/service 30575 1726867655.00299: done queuing 
things up, now waiting for results queue to drain 30575 1726867655.00301: waiting for pending results... 30575 1726867655.00601: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867655.00758: in run() - task 0affcac9-a3a5-e081-a588-000000001b47 30575 1726867655.00780: variable 'ansible_search_path' from source: unknown 30575 1726867655.00790: variable 'ansible_search_path' from source: unknown 30575 1726867655.00834: calling self._execute() 30575 1726867655.00931: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867655.00947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867655.00961: variable 'omit' from source: magic vars 30575 1726867655.01393: variable 'ansible_distribution_major_version' from source: facts 30575 1726867655.01409: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867655.01575: variable 'network_provider' from source: set_fact 30575 1726867655.01701: variable 'network_state' from source: role '' defaults 30575 1726867655.01704: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30575 1726867655.01706: variable 'omit' from source: magic vars 30575 1726867655.01708: variable 'omit' from source: magic vars 30575 1726867655.01709: variable 'network_service_name' from source: role '' defaults 30575 1726867655.01774: variable 'network_service_name' from source: role '' defaults 30575 1726867655.01887: variable '__network_provider_setup' from source: role '' defaults 30575 1726867655.01898: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867655.01965: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867655.01982: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867655.02051: variable '__network_packages_default_nm' from source: role '' defaults 
30575 1726867655.02276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867655.05518: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867655.05554: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867655.05622: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867655.05882: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867655.05885: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867655.05938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867655.06173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867655.06176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867655.06180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867655.06190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867655.06238: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867655.06265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867655.06300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867655.06343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867655.06363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867655.06604: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867655.06724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867655.06753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867655.06784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867655.06829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867655.06848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867655.06944: variable 'ansible_python' from source: facts 30575 1726867655.06964: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867655.07052: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867655.07133: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867655.07373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867655.07380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867655.07382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867655.07385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867655.07387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867655.07515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867655.07553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867655.07617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867655.07726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867655.07744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867655.08184: variable 'network_connections' from source: include params 30575 1726867655.08187: variable 'interface' from source: play vars 30575 1726867655.08189: variable 'interface' from source: play vars 30575 1726867655.08361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867655.08695: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867655.08885: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867655.08930: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867655.09096: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867655.10119: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867655.10160: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867655.10252: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867655.10297: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867655.10344: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867655.10683: variable 'network_connections' from source: include params 30575 1726867655.10689: variable 'interface' from source: play vars 30575 1726867655.10703: variable 'interface' from source: play vars 30575 1726867655.10738: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867655.10824: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867655.11105: variable 'network_connections' from source: include params 30575 1726867655.11118: variable 'interface' from source: play vars 30575 1726867655.11197: variable 'interface' from source: play vars 30575 1726867655.11221: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867655.11308: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867655.11609: variable 'network_connections' from source: include params 30575 1726867655.11621: variable 'interface' from source: play vars 30575 1726867655.11703: variable 'interface' from source: play vars 30575 1726867655.11784: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30575 1726867655.11827: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867655.11838: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867655.11905: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867655.12183: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867655.12619: variable 'network_connections' from source: include params 30575 1726867655.12631: variable 'interface' from source: play vars 30575 1726867655.12702: variable 'interface' from source: play vars 30575 1726867655.12714: variable 'ansible_distribution' from source: facts 30575 1726867655.12722: variable '__network_rh_distros' from source: role '' defaults 30575 1726867655.12731: variable 'ansible_distribution_major_version' from source: facts 30575 1726867655.12746: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867655.13239: variable 'ansible_distribution' from source: facts 30575 1726867655.13242: variable '__network_rh_distros' from source: role '' defaults 30575 1726867655.13244: variable 'ansible_distribution_major_version' from source: facts 30575 1726867655.13246: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867655.13340: variable 'ansible_distribution' from source: facts 30575 1726867655.13566: variable '__network_rh_distros' from source: role '' defaults 30575 1726867655.13569: variable 'ansible_distribution_major_version' from source: facts 30575 1726867655.13571: variable 'network_provider' from source: set_fact 30575 1726867655.13573: variable 'omit' from source: magic vars 30575 1726867655.13575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867655.13728: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867655.13749: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867655.13802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867655.14084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867655.14087: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867655.14089: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867655.14091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867655.14093: Set connection var ansible_pipelining to False 30575 1726867655.14095: Set connection var ansible_shell_type to sh 30575 1726867655.14097: Set connection var ansible_shell_executable to /bin/sh 30575 1726867655.14099: Set connection var ansible_timeout to 10 30575 1726867655.14101: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867655.14103: Set connection var ansible_connection to ssh 30575 1726867655.14105: variable 'ansible_shell_executable' from source: unknown 30575 1726867655.14107: variable 'ansible_connection' from source: unknown 30575 1726867655.14108: variable 'ansible_module_compression' from source: unknown 30575 1726867655.14110: variable 'ansible_shell_type' from source: unknown 30575 1726867655.14112: variable 'ansible_shell_executable' from source: unknown 30575 1726867655.14114: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867655.14416: variable 'ansible_pipelining' from source: unknown 30575 1726867655.14424: variable 'ansible_timeout' from source: unknown 30575 1726867655.14432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867655.14612: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867655.14637: variable 'omit' from source: magic vars 30575 1726867655.14649: starting attempt loop 30575 1726867655.14783: running the handler 30575 1726867655.14827: variable 'ansible_facts' from source: unknown 30575 1726867655.16291: _low_level_execute_command(): starting 30575 1726867655.16303: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867655.16971: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867655.16995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867655.17027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867655.17119: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867655.17143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
30575 1726867655.17161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867655.17445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867655.19369: stdout chunk (state=3): >>>/root <<< 30575 1726867655.19372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867655.19375: stdout chunk (state=3): >>><<< 30575 1726867655.19379: stderr chunk (state=3): >>><<< 30575 1726867655.19383: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867655.19385: _low_level_execute_command(): starting 30575 1726867655.19387: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867655.1934757-34873-22175641355489 `" && echo ansible-tmp-1726867655.1934757-34873-22175641355489="` echo /root/.ansible/tmp/ansible-tmp-1726867655.1934757-34873-22175641355489 `" ) && sleep 0' 30575 1726867655.20572: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867655.20794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867655.20832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867655.22734: stdout chunk (state=3): >>>ansible-tmp-1726867655.1934757-34873-22175641355489=/root/.ansible/tmp/ansible-tmp-1726867655.1934757-34873-22175641355489 <<< 30575 1726867655.23092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867655.23096: stdout chunk (state=3): >>><<< 30575 1726867655.23098: stderr chunk (state=3): >>><<< 30575 1726867655.23101: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726867655.1934757-34873-22175641355489=/root/.ansible/tmp/ansible-tmp-1726867655.1934757-34873-22175641355489 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867655.23393: variable 'ansible_module_compression' from source: unknown 30575 1726867655.23397: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30575 1726867655.23795: variable 'ansible_facts' from source: unknown 30575 1726867655.24125: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867655.1934757-34873-22175641355489/AnsiballZ_systemd.py 30575 1726867655.24407: Sending initial data 30575 1726867655.24417: Sent initial data (155 bytes) 30575 1726867655.25391: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867655.25499: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867655.25550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867655.25634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867655.25670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867655.27308: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" 
debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867655.27313: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867655.27357: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp4ocn26em /root/.ansible/tmp/ansible-tmp-1726867655.1934757-34873-22175641355489/AnsiballZ_systemd.py <<< 30575 1726867655.27360: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867655.1934757-34873-22175641355489/AnsiballZ_systemd.py" <<< 30575 1726867655.27509: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp4ocn26em" to remote "/root/.ansible/tmp/ansible-tmp-1726867655.1934757-34873-22175641355489/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867655.1934757-34873-22175641355489/AnsiballZ_systemd.py" <<< 30575 1726867655.30143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867655.30379: stderr chunk (state=3): >>><<< 30575 1726867655.30390: stdout chunk (state=3): >>><<< 30575 1726867655.30487: done transferring module to remote 30575 1726867655.30504: _low_level_execute_command(): starting 30575 1726867655.30586: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867655.1934757-34873-22175641355489/ /root/.ansible/tmp/ansible-tmp-1726867655.1934757-34873-22175641355489/AnsiballZ_systemd.py && sleep 0' 30575 1726867655.31436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867655.31440: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867655.31443: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867655.31445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867655.31567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867655.31609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867655.33452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867655.33456: stdout chunk (state=3): >>><<< 30575 1726867655.33495: stderr chunk (state=3): >>><<< 30575 1726867655.33599: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867655.33605: _low_level_execute_command(): starting 30575 1726867655.33608: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867655.1934757-34873-22175641355489/AnsiballZ_systemd.py && sleep 0' 30575 1726867655.34783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867655.34786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867655.34789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867655.34806: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867655.34812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867655.34992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867655.35013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867655.35091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867655.64005: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager 
--no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10571776", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3317903360", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1935603000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", 
"MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30575 1726867655.64015: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": 
"3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system<<< 30575 1726867655.64018: stdout chunk (state=3): >>>.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", 
"SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30575 1726867655.65837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867655.65841: stderr chunk (state=3): >>><<< 30575 1726867655.65844: stdout chunk (state=3): >>><<< 30575 1726867655.65863: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10571776", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3317903360", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1935603000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867655.66287: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867655.1934757-34873-22175641355489/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867655.66290: _low_level_execute_command(): starting 30575 1726867655.66293: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867655.1934757-34873-22175641355489/ > /dev/null 2>&1 && sleep 0' 30575 1726867655.67486: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867655.67490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867655.67492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867655.67495: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867655.67497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867655.67499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867655.67693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867655.67717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867655.67788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867655.69767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867655.69771: stdout chunk (state=3): >>><<< 30575 1726867655.69773: stderr chunk (state=3): >>><<< 30575 1726867655.69776: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867655.69780: handler run complete 30575 1726867655.69966: attempt loop complete, returning result 30575 1726867655.69970: _execute() done 30575 1726867655.69972: dumping result to json 30575 1726867655.69974: done dumping result, returning 30575 1726867655.69976: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-e081-a588-000000001b47] 30575 1726867655.69987: sending task result for task 0affcac9-a3a5-e081-a588-000000001b47 30575 1726867655.70443: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b47 30575 1726867655.70446: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867655.70507: no more pending results, returning what we have 30575 1726867655.70511: results queue empty 30575 1726867655.70511: checking for any_errors_fatal 30575 1726867655.70517: done checking for any_errors_fatal 30575 1726867655.70517: checking for max_fail_percentage 30575 1726867655.70519: done checking for max_fail_percentage 30575 1726867655.70520: checking to see if all hosts have failed and the running result is not ok 30575 1726867655.70521: done checking to see if all hosts have failed 30575 1726867655.70521: getting the remaining hosts for this loop 30575 1726867655.70522: done getting the remaining hosts for this loop 30575 1726867655.70526: getting the next task for host managed_node3 30575 1726867655.70532: done getting next task for host managed_node3 30575 1726867655.70535: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867655.70539: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867655.70552: getting variables 30575 1726867655.70554: in VariableManager get_vars() 30575 1726867655.70623: Calling all_inventory to load vars for managed_node3 30575 1726867655.70626: Calling groups_inventory to load vars for managed_node3 30575 1726867655.70628: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867655.70637: Calling all_plugins_play to load vars for managed_node3 30575 1726867655.70639: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867655.70642: Calling groups_plugins_play to load vars for managed_node3 30575 1726867655.72723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867655.74650: done with get_vars() 30575 1726867655.74680: done getting variables 30575 1726867655.74746: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:27:35 -0400 (0:00:00.748) 0:01:31.125 ****** 30575 1726867655.74795: entering _queue_task() for managed_node3/service 30575 1726867655.75094: worker is 1 (out of 1 available) 30575 1726867655.75109: exiting _queue_task() for managed_node3/service 30575 1726867655.75123: done queuing things up, now waiting for results queue to drain 30575 1726867655.75125: waiting for pending results... 
30575 1726867655.75321: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867655.75421: in run() - task 0affcac9-a3a5-e081-a588-000000001b48 30575 1726867655.75432: variable 'ansible_search_path' from source: unknown 30575 1726867655.75435: variable 'ansible_search_path' from source: unknown 30575 1726867655.75495: calling self._execute() 30575 1726867655.75582: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867655.75586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867655.75589: variable 'omit' from source: magic vars 30575 1726867655.75993: variable 'ansible_distribution_major_version' from source: facts 30575 1726867655.75997: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867655.76089: variable 'network_provider' from source: set_fact 30575 1726867655.76098: Evaluated conditional (network_provider == "nm"): True 30575 1726867655.76183: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867655.76288: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867655.76429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867655.78456: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867655.78530: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867655.78565: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867655.78598: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867655.78625: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867655.78833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867655.78860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867655.78889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867655.79027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867655.79031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867655.79192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867655.79226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867655.79245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867655.79352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867655.79356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867655.79359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867655.79362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867655.79594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867655.79646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867655.79654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867655.80014: variable 'network_connections' from source: include params 30575 1726867655.80025: variable 'interface' from source: play vars 30575 1726867655.80118: variable 'interface' from source: play vars 30575 1726867655.80165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867655.80392: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867655.80445: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867655.80554: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867655.80558: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867655.80560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867655.80584: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867655.80614: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867655.80640: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867655.80688: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867655.80965: variable 'network_connections' from source: include params 30575 1726867655.80968: variable 'interface' from source: play vars 30575 1726867655.81033: variable 'interface' from source: play vars 30575 1726867655.81061: Evaluated conditional (__network_wpa_supplicant_required): False 30575 1726867655.81064: when evaluation is False, skipping this task 30575 1726867655.81067: _execute() done 30575 1726867655.81069: dumping result to json 30575 1726867655.81072: done dumping result, returning 30575 1726867655.81091: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-e081-a588-000000001b48] 30575 
1726867655.81189: sending task result for task 0affcac9-a3a5-e081-a588-000000001b48 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30575 1726867655.81296: no more pending results, returning what we have 30575 1726867655.81299: results queue empty 30575 1726867655.81300: checking for any_errors_fatal 30575 1726867655.81321: done checking for any_errors_fatal 30575 1726867655.81323: checking for max_fail_percentage 30575 1726867655.81324: done checking for max_fail_percentage 30575 1726867655.81325: checking to see if all hosts have failed and the running result is not ok 30575 1726867655.81326: done checking to see if all hosts have failed 30575 1726867655.81327: getting the remaining hosts for this loop 30575 1726867655.81328: done getting the remaining hosts for this loop 30575 1726867655.81332: getting the next task for host managed_node3 30575 1726867655.81339: done getting next task for host managed_node3 30575 1726867655.81343: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867655.81348: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867655.81369: getting variables 30575 1726867655.81371: in VariableManager get_vars() 30575 1726867655.81410: Calling all_inventory to load vars for managed_node3 30575 1726867655.81412: Calling groups_inventory to load vars for managed_node3 30575 1726867655.81414: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867655.81506: Calling all_plugins_play to load vars for managed_node3 30575 1726867655.81511: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867655.81516: Calling groups_plugins_play to load vars for managed_node3 30575 1726867655.82039: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b48 30575 1726867655.82042: WORKER PROCESS EXITING 30575 1726867655.83173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867655.85373: done with get_vars() 30575 1726867655.85396: done getting variables 30575 1726867655.85458: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:27:35 -0400 (0:00:00.106) 0:01:31.232 ****** 30575 1726867655.85494: entering _queue_task() for managed_node3/service 30575 1726867655.85816: worker is 1 (out of 1 available) 30575 
1726867655.85829: exiting _queue_task() for managed_node3/service 30575 1726867655.85842: done queuing things up, now waiting for results queue to drain 30575 1726867655.85844: waiting for pending results... 30575 1726867655.86202: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867655.86275: in run() - task 0affcac9-a3a5-e081-a588-000000001b49 30575 1726867655.86295: variable 'ansible_search_path' from source: unknown 30575 1726867655.86299: variable 'ansible_search_path' from source: unknown 30575 1726867655.86340: calling self._execute() 30575 1726867655.86621: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867655.86625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867655.86628: variable 'omit' from source: magic vars 30575 1726867655.86818: variable 'ansible_distribution_major_version' from source: facts 30575 1726867655.86827: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867655.87054: variable 'network_provider' from source: set_fact 30575 1726867655.87057: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867655.87060: when evaluation is False, skipping this task 30575 1726867655.87061: _execute() done 30575 1726867655.87063: dumping result to json 30575 1726867655.87065: done dumping result, returning 30575 1726867655.87067: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-e081-a588-000000001b49] 30575 1726867655.87068: sending task result for task 0affcac9-a3a5-e081-a588-000000001b49 30575 1726867655.87133: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b49 30575 1726867655.87136: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 
1726867655.87175: no more pending results, returning what we have 30575 1726867655.87180: results queue empty 30575 1726867655.87181: checking for any_errors_fatal 30575 1726867655.87189: done checking for any_errors_fatal 30575 1726867655.87190: checking for max_fail_percentage 30575 1726867655.87192: done checking for max_fail_percentage 30575 1726867655.87193: checking to see if all hosts have failed and the running result is not ok 30575 1726867655.87194: done checking to see if all hosts have failed 30575 1726867655.87194: getting the remaining hosts for this loop 30575 1726867655.87196: done getting the remaining hosts for this loop 30575 1726867655.87200: getting the next task for host managed_node3 30575 1726867655.87208: done getting next task for host managed_node3 30575 1726867655.87212: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867655.87216: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867655.87242: getting variables 30575 1726867655.87243: in VariableManager get_vars() 30575 1726867655.87285: Calling all_inventory to load vars for managed_node3 30575 1726867655.87287: Calling groups_inventory to load vars for managed_node3 30575 1726867655.87290: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867655.87301: Calling all_plugins_play to load vars for managed_node3 30575 1726867655.87307: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867655.87310: Calling groups_plugins_play to load vars for managed_node3 30575 1726867655.88810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867655.90331: done with get_vars() 30575 1726867655.90354: done getting variables 30575 1726867655.90415: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:27:35 -0400 (0:00:00.049) 0:01:31.282 ****** 30575 1726867655.90451: entering _queue_task() for managed_node3/copy 30575 1726867655.91202: worker is 1 (out of 1 available) 30575 1726867655.91221: exiting _queue_task() for managed_node3/copy 30575 1726867655.91234: done queuing things up, now waiting for results queue to drain 30575 1726867655.91236: waiting for pending results... 
30575 1726867655.91749: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867655.91994: in run() - task 0affcac9-a3a5-e081-a588-000000001b4a 30575 1726867655.92225: variable 'ansible_search_path' from source: unknown 30575 1726867655.92229: variable 'ansible_search_path' from source: unknown 30575 1726867655.92231: calling self._execute() 30575 1726867655.92233: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867655.92338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867655.92351: variable 'omit' from source: magic vars 30575 1726867655.92939: variable 'ansible_distribution_major_version' from source: facts 30575 1726867655.92956: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867655.93080: variable 'network_provider' from source: set_fact 30575 1726867655.93097: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867655.93104: when evaluation is False, skipping this task 30575 1726867655.93113: _execute() done 30575 1726867655.93121: dumping result to json 30575 1726867655.93127: done dumping result, returning 30575 1726867655.93139: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-e081-a588-000000001b4a] 30575 1726867655.93148: sending task result for task 0affcac9-a3a5-e081-a588-000000001b4a skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30575 1726867655.93300: no more pending results, returning what we have 30575 1726867655.93307: results queue empty 30575 1726867655.93308: checking for any_errors_fatal 30575 1726867655.93316: done checking for any_errors_fatal 30575 1726867655.93317: checking for max_fail_percentage 30575 
1726867655.93318: done checking for max_fail_percentage 30575 1726867655.93319: checking to see if all hosts have failed and the running result is not ok 30575 1726867655.93320: done checking to see if all hosts have failed 30575 1726867655.93321: getting the remaining hosts for this loop 30575 1726867655.93323: done getting the remaining hosts for this loop 30575 1726867655.93326: getting the next task for host managed_node3 30575 1726867655.93334: done getting next task for host managed_node3 30575 1726867655.93338: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867655.93343: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867655.93580: getting variables 30575 1726867655.93582: in VariableManager get_vars() 30575 1726867655.93623: Calling all_inventory to load vars for managed_node3 30575 1726867655.93626: Calling groups_inventory to load vars for managed_node3 30575 1726867655.93633: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867655.93639: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b4a 30575 1726867655.93642: WORKER PROCESS EXITING 30575 1726867655.93660: Calling all_plugins_play to load vars for managed_node3 30575 1726867655.93663: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867655.93673: Calling groups_plugins_play to load vars for managed_node3 30575 1726867655.94913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867655.95867: done with get_vars() 30575 1726867655.95884: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:27:35 -0400 (0:00:00.054) 0:01:31.337 ****** 30575 1726867655.95945: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867655.96154: worker is 1 (out of 1 available) 30575 1726867655.96169: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867655.96183: done queuing things up, now waiting for results queue to drain 30575 1726867655.96185: waiting for pending results... 
30575 1726867655.96359: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867655.96489: in run() - task 0affcac9-a3a5-e081-a588-000000001b4b 30575 1726867655.96503: variable 'ansible_search_path' from source: unknown 30575 1726867655.96508: variable 'ansible_search_path' from source: unknown 30575 1726867655.96556: calling self._execute() 30575 1726867655.96883: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867655.96887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867655.96890: variable 'omit' from source: magic vars 30575 1726867655.97036: variable 'ansible_distribution_major_version' from source: facts 30575 1726867655.97046: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867655.97052: variable 'omit' from source: magic vars 30575 1726867655.97125: variable 'omit' from source: magic vars 30575 1726867655.97268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867656.03873: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867656.03924: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867656.03951: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867656.03974: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867656.03995: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867656.04054: variable 'network_provider' from source: set_fact 30575 1726867656.04136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867656.04159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867656.04178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867656.04206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867656.04219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867656.04269: variable 'omit' from source: magic vars 30575 1726867656.04341: variable 'omit' from source: magic vars 30575 1726867656.04412: variable 'network_connections' from source: include params 30575 1726867656.04419: variable 'interface' from source: play vars 30575 1726867656.04462: variable 'interface' from source: play vars 30575 1726867656.04552: variable 'omit' from source: magic vars 30575 1726867656.04558: variable '__lsr_ansible_managed' from source: task vars 30575 1726867656.04603: variable '__lsr_ansible_managed' from source: task vars 30575 1726867656.04721: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30575 1726867656.04845: Loaded config def from plugin (lookup/template) 30575 1726867656.04848: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30575 1726867656.04867: File lookup term: get_ansible_managed.j2 30575 1726867656.04869: variable 
'ansible_search_path' from source: unknown 30575 1726867656.04873: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30575 1726867656.04884: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30575 1726867656.04897: variable 'ansible_search_path' from source: unknown 30575 1726867656.07933: variable 'ansible_managed' from source: unknown 30575 1726867656.08019: variable 'omit' from source: magic vars 30575 1726867656.08036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867656.08053: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867656.08063: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867656.08074: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30575 1726867656.08083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867656.08097: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867656.08100: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867656.08102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867656.08161: Set connection var ansible_pipelining to False 30575 1726867656.08165: Set connection var ansible_shell_type to sh 30575 1726867656.08170: Set connection var ansible_shell_executable to /bin/sh 30575 1726867656.08175: Set connection var ansible_timeout to 10 30575 1726867656.08182: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867656.08188: Set connection var ansible_connection to ssh 30575 1726867656.08205: variable 'ansible_shell_executable' from source: unknown 30575 1726867656.08210: variable 'ansible_connection' from source: unknown 30575 1726867656.08213: variable 'ansible_module_compression' from source: unknown 30575 1726867656.08215: variable 'ansible_shell_type' from source: unknown 30575 1726867656.08217: variable 'ansible_shell_executable' from source: unknown 30575 1726867656.08220: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867656.08224: variable 'ansible_pipelining' from source: unknown 30575 1726867656.08227: variable 'ansible_timeout' from source: unknown 30575 1726867656.08231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867656.08313: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867656.08324: variable 'omit' from 
source: magic vars 30575 1726867656.08327: starting attempt loop 30575 1726867656.08330: running the handler 30575 1726867656.08337: _low_level_execute_command(): starting 30575 1726867656.08342: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867656.08835: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867656.08838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867656.08841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867656.08843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867656.08845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867656.08888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867656.08898: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867656.08907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867656.08966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867656.10642: stdout chunk (state=3): >>>/root <<< 30575 1726867656.10764: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 30575 1726867656.10767: stdout chunk (state=3): >>><<< 30575 1726867656.10776: stderr chunk (state=3): >>><<< 30575 1726867656.10796: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867656.10804: _low_level_execute_command(): starting 30575 1726867656.10810: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867656.1079543-34944-1155348327294 `" && echo ansible-tmp-1726867656.1079543-34944-1155348327294="` echo /root/.ansible/tmp/ansible-tmp-1726867656.1079543-34944-1155348327294 `" ) && sleep 0' 30575 1726867656.11238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867656.11241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867656.11243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867656.11246: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867656.11247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867656.11293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867656.11296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867656.11347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867656.13224: stdout chunk (state=3): >>>ansible-tmp-1726867656.1079543-34944-1155348327294=/root/.ansible/tmp/ansible-tmp-1726867656.1079543-34944-1155348327294 <<< 30575 1726867656.13331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867656.13354: stderr chunk (state=3): >>><<< 30575 1726867656.13357: stdout chunk (state=3): >>><<< 30575 1726867656.13370: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867656.1079543-34944-1155348327294=/root/.ansible/tmp/ansible-tmp-1726867656.1079543-34944-1155348327294 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867656.13403: variable 'ansible_module_compression' from source: unknown 30575 1726867656.13433: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30575 1726867656.13451: variable 'ansible_facts' from source: unknown 30575 1726867656.13517: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867656.1079543-34944-1155348327294/AnsiballZ_network_connections.py 30575 1726867656.13610: Sending initial data 30575 1726867656.13613: Sent initial data (166 bytes) 30575 1726867656.14030: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867656.14033: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867656.14041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867656.14043: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867656.14045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867656.14047: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867656.14091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867656.14094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867656.14142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867656.15673: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports 
extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867656.15714: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867656.15760: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpcj8yh_2k /root/.ansible/tmp/ansible-tmp-1726867656.1079543-34944-1155348327294/AnsiballZ_network_connections.py <<< 30575 1726867656.15763: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867656.1079543-34944-1155348327294/AnsiballZ_network_connections.py" <<< 30575 1726867656.15800: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpcj8yh_2k" to remote "/root/.ansible/tmp/ansible-tmp-1726867656.1079543-34944-1155348327294/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867656.1079543-34944-1155348327294/AnsiballZ_network_connections.py" <<< 30575 1726867656.16525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867656.16559: stderr chunk (state=3): >>><<< 30575 1726867656.16563: stdout chunk (state=3): >>><<< 30575 1726867656.16593: done transferring module to remote 30575 1726867656.16601: _low_level_execute_command(): starting 30575 1726867656.16603: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867656.1079543-34944-1155348327294/ /root/.ansible/tmp/ansible-tmp-1726867656.1079543-34944-1155348327294/AnsiballZ_network_connections.py && sleep 0' 30575 1726867656.17025: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30575 1726867656.17028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867656.17030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867656.17033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867656.17117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867656.17120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867656.17155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867656.18889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867656.18907: stderr chunk (state=3): >>><<< 30575 1726867656.18911: stdout chunk (state=3): >>><<< 30575 1726867656.18924: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867656.18927: _low_level_execute_command(): starting 30575 1726867656.18930: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867656.1079543-34944-1155348327294/AnsiballZ_network_connections.py && sleep 0' 30575 1726867656.19601: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867656.19604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867656.19608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867656.19611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867656.19615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867656.19617: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867656.19619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867656.19621: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867656.19624: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867656.19630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867656.19632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867656.19670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867656.46489: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_3uhdenov/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_3uhdenov/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/907d8824-891a-4719-b02a-cbadb34e89d9: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, 
"__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30575 1726867656.48148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867656.48155: stdout chunk (state=3): >>><<< 30575 1726867656.48158: stderr chunk (state=3): >>><<< 30575 1726867656.48185: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_3uhdenov/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_3uhdenov/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/907d8824-891a-4719-b02a-cbadb34e89d9: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867656.48247: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867656.1079543-34944-1155348327294/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867656.48285: _low_level_execute_command(): starting 30575 1726867656.48289: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867656.1079543-34944-1155348327294/ > /dev/null 2>&1 && sleep 0' 30575 1726867656.49043: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867656.49058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867656.49073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867656.49113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867656.49224: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867656.49253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867656.49274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867656.49389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867656.51235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867656.51284: stderr chunk (state=3): >>><<< 30575 1726867656.51293: stdout chunk (state=3): >>><<< 30575 1726867656.51317: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867656.51336: handler run complete 30575 1726867656.51366: attempt loop complete, returning result 30575 1726867656.51373: _execute() done 30575 1726867656.51482: dumping result to json 30575 1726867656.51486: done dumping result, returning 30575 1726867656.51488: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-e081-a588-000000001b4b] 30575 1726867656.51490: sending task result for task 0affcac9-a3a5-e081-a588-000000001b4b 30575 1726867656.51567: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b4b 30575 1726867656.51570: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 30575 1726867656.51674: no more pending results, returning what we have 30575 1726867656.51680: results queue empty 30575 1726867656.51681: checking for any_errors_fatal 30575 1726867656.51689: done checking for any_errors_fatal 30575 1726867656.51690: checking for max_fail_percentage 30575 1726867656.51695: done checking for max_fail_percentage 30575 1726867656.51696: checking to see if all hosts have failed and the running result is not ok 30575 1726867656.51697: done checking to see if all hosts have failed 30575 1726867656.51698: getting the remaining hosts for this loop 30575 1726867656.51700: done getting the remaining hosts for this loop 30575 1726867656.51704: getting the next task for host managed_node3 30575 1726867656.51712: done getting next task for host 
managed_node3 30575 1726867656.51716: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867656.51721: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867656.51735: getting variables 30575 1726867656.51737: in VariableManager get_vars() 30575 1726867656.51775: Calling all_inventory to load vars for managed_node3 30575 1726867656.51986: Calling groups_inventory to load vars for managed_node3 30575 1726867656.51990: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867656.52004: Calling all_plugins_play to load vars for managed_node3 30575 1726867656.52007: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867656.52013: Calling groups_plugins_play to load vars for managed_node3 30575 1726867656.65391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867656.66999: done with get_vars() 30575 1726867656.67027: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:27:36 -0400 (0:00:00.711) 0:01:32.048 ****** 30575 1726867656.67100: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867656.67706: worker is 1 (out of 1 available) 30575 1726867656.67719: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867656.67729: done queuing things up, now waiting for results queue to drain 30575 1726867656.67731: waiting for pending results... 
30575 1726867656.67973: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867656.68024: in run() - task 0affcac9-a3a5-e081-a588-000000001b4c 30575 1726867656.68044: variable 'ansible_search_path' from source: unknown 30575 1726867656.68053: variable 'ansible_search_path' from source: unknown 30575 1726867656.68101: calling self._execute() 30575 1726867656.68286: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867656.68292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867656.68295: variable 'omit' from source: magic vars 30575 1726867656.68653: variable 'ansible_distribution_major_version' from source: facts 30575 1726867656.68672: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867656.68808: variable 'network_state' from source: role '' defaults 30575 1726867656.68832: Evaluated conditional (network_state != {}): False 30575 1726867656.68840: when evaluation is False, skipping this task 30575 1726867656.68848: _execute() done 30575 1726867656.68856: dumping result to json 30575 1726867656.68865: done dumping result, returning 30575 1726867656.68879: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-e081-a588-000000001b4c] 30575 1726867656.68890: sending task result for task 0affcac9-a3a5-e081-a588-000000001b4c skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867656.69173: no more pending results, returning what we have 30575 1726867656.69180: results queue empty 30575 1726867656.69181: checking for any_errors_fatal 30575 1726867656.69200: done checking for any_errors_fatal 30575 1726867656.69201: checking for max_fail_percentage 30575 1726867656.69203: done checking for max_fail_percentage 30575 1726867656.69204: 
checking to see if all hosts have failed and the running result is not ok 30575 1726867656.69205: done checking to see if all hosts have failed 30575 1726867656.69206: getting the remaining hosts for this loop 30575 1726867656.69208: done getting the remaining hosts for this loop 30575 1726867656.69215: getting the next task for host managed_node3 30575 1726867656.69223: done getting next task for host managed_node3 30575 1726867656.69227: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867656.69233: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867656.69266: getting variables 30575 1726867656.69268: in VariableManager get_vars() 30575 1726867656.69427: Calling all_inventory to load vars for managed_node3 30575 1726867656.69430: Calling groups_inventory to load vars for managed_node3 30575 1726867656.69432: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867656.69439: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b4c 30575 1726867656.69441: WORKER PROCESS EXITING 30575 1726867656.69452: Calling all_plugins_play to load vars for managed_node3 30575 1726867656.69455: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867656.69458: Calling groups_plugins_play to load vars for managed_node3 30575 1726867656.70963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867656.72727: done with get_vars() 30575 1726867656.72747: done getting variables 30575 1726867656.72812: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:27:36 -0400 (0:00:00.057) 0:01:32.106 ****** 30575 1726867656.72848: entering _queue_task() for managed_node3/debug 30575 1726867656.73170: worker is 1 (out of 1 available) 30575 1726867656.73323: exiting _queue_task() for managed_node3/debug 30575 1726867656.73334: done queuing things up, now waiting for results queue to drain 30575 1726867656.73336: waiting for pending results... 
30575 1726867656.73518: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867656.73691: in run() - task 0affcac9-a3a5-e081-a588-000000001b4d 30575 1726867656.73710: variable 'ansible_search_path' from source: unknown 30575 1726867656.73722: variable 'ansible_search_path' from source: unknown 30575 1726867656.73767: calling self._execute() 30575 1726867656.73873: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867656.73891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867656.73909: variable 'omit' from source: magic vars 30575 1726867656.74323: variable 'ansible_distribution_major_version' from source: facts 30575 1726867656.74341: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867656.74382: variable 'omit' from source: magic vars 30575 1726867656.74431: variable 'omit' from source: magic vars 30575 1726867656.74472: variable 'omit' from source: magic vars 30575 1726867656.74524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867656.74567: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867656.74629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867656.74633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867656.74641: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867656.74738: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867656.74742: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867656.74744: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30575 1726867656.74810: Set connection var ansible_pipelining to False 30575 1726867656.74822: Set connection var ansible_shell_type to sh 30575 1726867656.74834: Set connection var ansible_shell_executable to /bin/sh 30575 1726867656.74849: Set connection var ansible_timeout to 10 30575 1726867656.74864: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867656.74879: Set connection var ansible_connection to ssh 30575 1726867656.74907: variable 'ansible_shell_executable' from source: unknown 30575 1726867656.74919: variable 'ansible_connection' from source: unknown 30575 1726867656.74928: variable 'ansible_module_compression' from source: unknown 30575 1726867656.74936: variable 'ansible_shell_type' from source: unknown 30575 1726867656.74954: variable 'ansible_shell_executable' from source: unknown 30575 1726867656.74957: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867656.74971: variable 'ansible_pipelining' from source: unknown 30575 1726867656.74973: variable 'ansible_timeout' from source: unknown 30575 1726867656.74976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867656.75174: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867656.75180: variable 'omit' from source: magic vars 30575 1726867656.75182: starting attempt loop 30575 1726867656.75186: running the handler 30575 1726867656.75302: variable '__network_connections_result' from source: set_fact 30575 1726867656.75360: handler run complete 30575 1726867656.75390: attempt loop complete, returning result 30575 1726867656.75398: _execute() done 30575 1726867656.75483: dumping result to json 30575 1726867656.75486: 
done dumping result, returning 30575 1726867656.75489: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-e081-a588-000000001b4d] 30575 1726867656.75492: sending task result for task 0affcac9-a3a5-e081-a588-000000001b4d 30575 1726867656.75562: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b4d 30575 1726867656.75565: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 30575 1726867656.75646: no more pending results, returning what we have 30575 1726867656.75650: results queue empty 30575 1726867656.75651: checking for any_errors_fatal 30575 1726867656.75657: done checking for any_errors_fatal 30575 1726867656.75658: checking for max_fail_percentage 30575 1726867656.75660: done checking for max_fail_percentage 30575 1726867656.75661: checking to see if all hosts have failed and the running result is not ok 30575 1726867656.75662: done checking to see if all hosts have failed 30575 1726867656.75663: getting the remaining hosts for this loop 30575 1726867656.75664: done getting the remaining hosts for this loop 30575 1726867656.75668: getting the next task for host managed_node3 30575 1726867656.75882: done getting next task for host managed_node3 30575 1726867656.75887: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867656.75892: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867656.75905: getting variables 30575 1726867656.75907: in VariableManager get_vars() 30575 1726867656.75947: Calling all_inventory to load vars for managed_node3 30575 1726867656.75950: Calling groups_inventory to load vars for managed_node3 30575 1726867656.75952: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867656.75962: Calling all_plugins_play to load vars for managed_node3 30575 1726867656.75964: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867656.75967: Calling groups_plugins_play to load vars for managed_node3 30575 1726867656.77398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867656.79052: done with get_vars() 30575 1726867656.79080: done getting variables 30575 1726867656.79150: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:27:36 -0400 (0:00:00.063) 0:01:32.169 ****** 30575 1726867656.79194: entering _queue_task() for managed_node3/debug 30575 1726867656.79557: worker is 1 (out of 1 available) 30575 1726867656.79570: exiting _queue_task() for managed_node3/debug 30575 1726867656.79691: done queuing things up, now waiting for results queue to drain 30575 1726867656.79694: waiting for pending results... 30575 1726867656.79999: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867656.80096: in run() - task 0affcac9-a3a5-e081-a588-000000001b4e 30575 1726867656.80100: variable 'ansible_search_path' from source: unknown 30575 1726867656.80103: variable 'ansible_search_path' from source: unknown 30575 1726867656.80130: calling self._execute() 30575 1726867656.80237: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867656.80312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867656.80320: variable 'omit' from source: magic vars 30575 1726867656.80685: variable 'ansible_distribution_major_version' from source: facts 30575 1726867656.80702: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867656.80712: variable 'omit' from source: magic vars 30575 1726867656.80786: variable 'omit' from source: magic vars 30575 1726867656.80826: variable 'omit' from source: magic vars 30575 1726867656.80880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867656.80921: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867656.80948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867656.81079: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867656.81082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867656.81085: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867656.81087: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867656.81089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867656.81151: Set connection var ansible_pipelining to False 30575 1726867656.81159: Set connection var ansible_shell_type to sh 30575 1726867656.81170: Set connection var ansible_shell_executable to /bin/sh 30575 1726867656.81187: Set connection var ansible_timeout to 10 30575 1726867656.81198: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867656.81210: Set connection var ansible_connection to ssh 30575 1726867656.81240: variable 'ansible_shell_executable' from source: unknown 30575 1726867656.81249: variable 'ansible_connection' from source: unknown 30575 1726867656.81256: variable 'ansible_module_compression' from source: unknown 30575 1726867656.81265: variable 'ansible_shell_type' from source: unknown 30575 1726867656.81272: variable 'ansible_shell_executable' from source: unknown 30575 1726867656.81281: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867656.81297: variable 'ansible_pipelining' from source: unknown 30575 1726867656.81305: variable 'ansible_timeout' from source: unknown 30575 1726867656.81316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867656.81462: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867656.81482: variable 'omit' from source: magic vars 30575 1726867656.81493: starting attempt loop 30575 1726867656.81511: running the handler 30575 1726867656.81582: variable '__network_connections_result' from source: set_fact 30575 1726867656.81655: variable '__network_connections_result' from source: set_fact 30575 1726867656.81774: handler run complete 30575 1726867656.81806: attempt loop complete, returning result 30575 1726867656.81840: _execute() done 30575 1726867656.81844: dumping result to json 30575 1726867656.81846: done dumping result, returning 30575 1726867656.81848: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-e081-a588-000000001b4e] 30575 1726867656.81856: sending task result for task 0affcac9-a3a5-e081-a588-000000001b4e ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 30575 1726867656.82180: no more pending results, returning what we have 30575 1726867656.82184: results queue empty 30575 1726867656.82185: checking for any_errors_fatal 30575 1726867656.82193: done checking for any_errors_fatal 30575 1726867656.82194: checking for max_fail_percentage 30575 1726867656.82196: done checking for max_fail_percentage 30575 1726867656.82197: checking to see if all hosts have failed and the running result is not ok 30575 1726867656.82198: done checking to see if all hosts have failed 30575 1726867656.82199: getting the 
remaining hosts for this loop 30575 1726867656.82201: done getting the remaining hosts for this loop 30575 1726867656.82204: getting the next task for host managed_node3 30575 1726867656.82217: done getting next task for host managed_node3 30575 1726867656.82221: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867656.82227: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867656.82243: getting variables 30575 1726867656.82245: in VariableManager get_vars() 30575 1726867656.82399: Calling all_inventory to load vars for managed_node3 30575 1726867656.82402: Calling groups_inventory to load vars for managed_node3 30575 1726867656.82404: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867656.82493: Calling all_plugins_play to load vars for managed_node3 30575 1726867656.82497: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867656.82502: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b4e 30575 1726867656.82505: WORKER PROCESS EXITING 30575 1726867656.82509: Calling groups_plugins_play to load vars for managed_node3 30575 1726867656.84127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867656.85768: done with get_vars() 30575 1726867656.85797: done getting variables 30575 1726867656.85866: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:27:36 -0400 (0:00:00.067) 0:01:32.236 ****** 30575 1726867656.85906: entering _queue_task() for managed_node3/debug 30575 1726867656.86493: worker is 1 (out of 1 available) 30575 1726867656.86503: exiting _queue_task() for managed_node3/debug 30575 1726867656.86516: done queuing things up, now waiting for results queue to drain 30575 1726867656.86518: waiting for pending results... 
30575 1726867656.86652: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867656.86808: in run() - task 0affcac9-a3a5-e081-a588-000000001b4f 30575 1726867656.86832: variable 'ansible_search_path' from source: unknown 30575 1726867656.86841: variable 'ansible_search_path' from source: unknown 30575 1726867656.86895: calling self._execute() 30575 1726867656.87005: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867656.87021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867656.87038: variable 'omit' from source: magic vars 30575 1726867656.87457: variable 'ansible_distribution_major_version' from source: facts 30575 1726867656.87476: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867656.87623: variable 'network_state' from source: role '' defaults 30575 1726867656.87640: Evaluated conditional (network_state != {}): False 30575 1726867656.87649: when evaluation is False, skipping this task 30575 1726867656.87657: _execute() done 30575 1726867656.87666: dumping result to json 30575 1726867656.87675: done dumping result, returning 30575 1726867656.87691: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-e081-a588-000000001b4f] 30575 1726867656.87701: sending task result for task 0affcac9-a3a5-e081-a588-000000001b4f skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30575 1726867656.87873: no more pending results, returning what we have 30575 1726867656.87880: results queue empty 30575 1726867656.87881: checking for any_errors_fatal 30575 1726867656.87892: done checking for any_errors_fatal 30575 1726867656.87893: checking for max_fail_percentage 30575 1726867656.87895: done checking for max_fail_percentage 30575 1726867656.87896: checking to see if all hosts have 
failed and the running result is not ok 30575 1726867656.87898: done checking to see if all hosts have failed 30575 1726867656.87898: getting the remaining hosts for this loop 30575 1726867656.87900: done getting the remaining hosts for this loop 30575 1726867656.87904: getting the next task for host managed_node3 30575 1726867656.87912: done getting next task for host managed_node3 30575 1726867656.87919: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867656.87925: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867656.87955: getting variables 30575 1726867656.87958: in VariableManager get_vars() 30575 1726867656.88107: Calling all_inventory to load vars for managed_node3 30575 1726867656.88110: Calling groups_inventory to load vars for managed_node3 30575 1726867656.88113: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867656.88201: Calling all_plugins_play to load vars for managed_node3 30575 1726867656.88205: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867656.88208: Calling groups_plugins_play to load vars for managed_node3 30575 1726867656.88923: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b4f 30575 1726867656.88926: WORKER PROCESS EXITING 30575 1726867656.89727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867656.91390: done with get_vars() 30575 1726867656.91411: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:27:36 -0400 (0:00:00.056) 0:01:32.292 ****** 30575 1726867656.91512: entering _queue_task() for managed_node3/ping 30575 1726867656.91917: worker is 1 (out of 1 available) 30575 1726867656.91928: exiting _queue_task() for managed_node3/ping 30575 1726867656.91939: done queuing things up, now waiting for results queue to drain 30575 1726867656.91940: waiting for pending results... 
30575 1726867656.92151: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867656.92321: in run() - task 0affcac9-a3a5-e081-a588-000000001b50 30575 1726867656.92343: variable 'ansible_search_path' from source: unknown 30575 1726867656.92351: variable 'ansible_search_path' from source: unknown 30575 1726867656.92396: calling self._execute() 30575 1726867656.92505: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867656.92521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867656.92536: variable 'omit' from source: magic vars 30575 1726867656.92949: variable 'ansible_distribution_major_version' from source: facts 30575 1726867656.92966: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867656.92982: variable 'omit' from source: magic vars 30575 1726867656.93052: variable 'omit' from source: magic vars 30575 1726867656.93095: variable 'omit' from source: magic vars 30575 1726867656.93146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867656.93186: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867656.93213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867656.93237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867656.93261: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867656.93297: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867656.93311: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867656.93323: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867656.93433: Set connection var ansible_pipelining to False 30575 1726867656.93441: Set connection var ansible_shell_type to sh 30575 1726867656.93451: Set connection var ansible_shell_executable to /bin/sh 30575 1726867656.93460: Set connection var ansible_timeout to 10 30575 1726867656.93474: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867656.93524: Set connection var ansible_connection to ssh 30575 1726867656.93528: variable 'ansible_shell_executable' from source: unknown 30575 1726867656.93530: variable 'ansible_connection' from source: unknown 30575 1726867656.93533: variable 'ansible_module_compression' from source: unknown 30575 1726867656.93536: variable 'ansible_shell_type' from source: unknown 30575 1726867656.93544: variable 'ansible_shell_executable' from source: unknown 30575 1726867656.93551: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867656.93558: variable 'ansible_pipelining' from source: unknown 30575 1726867656.93565: variable 'ansible_timeout' from source: unknown 30575 1726867656.93578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867656.93850: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867656.93855: variable 'omit' from source: magic vars 30575 1726867656.93857: starting attempt loop 30575 1726867656.93860: running the handler 30575 1726867656.93862: _low_level_execute_command(): starting 30575 1726867656.93864: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867656.94598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867656.94624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 
1726867656.94736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867656.94753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867656.94770: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867656.94793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867656.95060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867656.96688: stdout chunk (state=3): >>>/root <<< 30575 1726867656.96845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867656.96856: stdout chunk (state=3): >>><<< 30575 1726867656.96870: stderr chunk (state=3): >>><<< 30575 1726867656.96928: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867656.97176: _low_level_execute_command(): starting 30575 1726867656.97182: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867656.9708443-34995-244158154722556 `" && echo ansible-tmp-1726867656.9708443-34995-244158154722556="` echo /root/.ansible/tmp/ansible-tmp-1726867656.9708443-34995-244158154722556 `" ) && sleep 0' 30575 1726867656.98291: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867656.98300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867656.98311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867656.98325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867656.98337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867656.98357: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867656.98360: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867656.98367: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867656.98378: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867656.98386: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867656.98394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867656.98404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867656.98419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867656.98423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867656.98491: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867656.98789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867656.98792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867657.00667: stdout chunk (state=3): >>>ansible-tmp-1726867656.9708443-34995-244158154722556=/root/.ansible/tmp/ansible-tmp-1726867656.9708443-34995-244158154722556 <<< 30575 1726867657.00806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867657.00809: stdout chunk (state=3): >>><<< 30575 1726867657.00819: stderr chunk (state=3): >>><<< 30575 1726867657.00832: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867656.9708443-34995-244158154722556=/root/.ansible/tmp/ansible-tmp-1726867656.9708443-34995-244158154722556 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867657.00881: variable 'ansible_module_compression' from source: unknown 30575 1726867657.00921: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30575 1726867657.00951: variable 'ansible_facts' from source: unknown 30575 1726867657.01383: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867656.9708443-34995-244158154722556/AnsiballZ_ping.py 30575 1726867657.01588: Sending initial data 30575 1726867657.01591: Sent initial data (153 bytes) 30575 1726867657.02882: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867657.02893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867657.02902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867657.02969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867657.04500: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30575 1726867657.04505: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867657.04602: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867657.04621: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867656.9708443-34995-244158154722556/AnsiballZ_ping.py" <<< 30575 1726867657.04624: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpqbyiih0n /root/.ansible/tmp/ansible-tmp-1726867656.9708443-34995-244158154722556/AnsiballZ_ping.py <<< 30575 1726867657.04664: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpqbyiih0n" to remote "/root/.ansible/tmp/ansible-tmp-1726867656.9708443-34995-244158154722556/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867656.9708443-34995-244158154722556/AnsiballZ_ping.py" <<< 30575 1726867657.05892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867657.05997: stderr chunk (state=3): >>><<< 30575 1726867657.06001: stdout chunk (state=3): >>><<< 30575 1726867657.06023: done transferring module to remote 30575 1726867657.06083: _low_level_execute_command(): starting 30575 1726867657.06087: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867656.9708443-34995-244158154722556/ /root/.ansible/tmp/ansible-tmp-1726867656.9708443-34995-244158154722556/AnsiballZ_ping.py && sleep 0' 30575 1726867657.07215: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867657.07484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867657.07487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867657.09318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867657.09322: stdout chunk (state=3): >>><<< 30575 1726867657.09324: stderr chunk (state=3): >>><<< 30575 1726867657.09326: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867657.09329: _low_level_execute_command(): starting 30575 1726867657.09332: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867656.9708443-34995-244158154722556/AnsiballZ_ping.py && sleep 0' 30575 1726867657.10235: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867657.10238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867657.10240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867657.10243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867657.10248: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867657.10255: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867657.10265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867657.10281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867657.10290: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867657.10297: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867657.10305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867657.10315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867657.10331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867657.10340: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 
<<< 30575 1726867657.10391: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867657.10425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867657.10440: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867657.10456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867657.10536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867657.25415: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30575 1726867657.26605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867657.26609: stdout chunk (state=3): >>><<< 30575 1726867657.26611: stderr chunk (state=3): >>><<< 30575 1726867657.26884: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867657.26889: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867656.9708443-34995-244158154722556/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867657.26892: _low_level_execute_command(): starting 30575 1726867657.26894: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867656.9708443-34995-244158154722556/ > /dev/null 2>&1 && sleep 0' 30575 1726867657.27879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867657.28010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867657.28031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867657.28112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867657.30030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867657.30033: stdout chunk (state=3): >>><<< 30575 1726867657.30039: stderr chunk (state=3): >>><<< 30575 1726867657.30056: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867657.30063: handler run complete 30575 1726867657.30080: attempt loop complete, returning result 30575 1726867657.30083: _execute() done 30575 1726867657.30086: dumping result to json 30575 1726867657.30088: done dumping result, returning 30575 1726867657.30099: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-e081-a588-000000001b50] 30575 1726867657.30104: sending task result for task 0affcac9-a3a5-e081-a588-000000001b50 30575 1726867657.30207: done sending task result for task 0affcac9-a3a5-e081-a588-000000001b50 30575 1726867657.30210: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30575 1726867657.30290: no more pending results, returning what we have 30575 1726867657.30294: results queue empty 30575 1726867657.30295: checking for any_errors_fatal 30575 1726867657.30303: done checking for any_errors_fatal 30575 1726867657.30304: checking for max_fail_percentage 30575 1726867657.30306: done checking for max_fail_percentage 30575 1726867657.30307: checking to see if all hosts have failed and the running result is not ok 30575 1726867657.30308: done checking to see if all hosts have failed 30575 1726867657.30308: getting the remaining hosts for this loop 30575 1726867657.30310: done getting the remaining hosts for this loop 30575 1726867657.30314: getting the next task for host managed_node3 30575 1726867657.30327: done getting next task for host managed_node3 30575 1726867657.30330: ^ task is: TASK: meta (role_complete) 30575 1726867657.30336: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867657.30351: getting variables 30575 1726867657.30353: in VariableManager get_vars() 30575 1726867657.30606: Calling all_inventory to load vars for managed_node3 30575 1726867657.30609: Calling groups_inventory to load vars for managed_node3 30575 1726867657.30611: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867657.30621: Calling all_plugins_play to load vars for managed_node3 30575 1726867657.30624: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867657.30626: Calling groups_plugins_play to load vars for managed_node3 30575 1726867657.33826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867657.37295: done with get_vars() 30575 1726867657.37324: done getting variables 30575 1726867657.37534: done queuing things up, now waiting for results queue to drain 30575 1726867657.37537: results queue empty 30575 1726867657.37537: checking for any_errors_fatal 30575 1726867657.37540: done checking for 
any_errors_fatal 30575 1726867657.37541: checking for max_fail_percentage 30575 1726867657.37542: done checking for max_fail_percentage 30575 1726867657.37543: checking to see if all hosts have failed and the running result is not ok 30575 1726867657.37544: done checking to see if all hosts have failed 30575 1726867657.37544: getting the remaining hosts for this loop 30575 1726867657.37545: done getting the remaining hosts for this loop 30575 1726867657.37548: getting the next task for host managed_node3 30575 1726867657.37554: done getting next task for host managed_node3 30575 1726867657.37556: ^ task is: TASK: Test 30575 1726867657.37558: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867657.37560: getting variables 30575 1726867657.37561: in VariableManager get_vars() 30575 1726867657.37573: Calling all_inventory to load vars for managed_node3 30575 1726867657.37575: Calling groups_inventory to load vars for managed_node3 30575 1726867657.37680: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867657.37690: Calling all_plugins_play to load vars for managed_node3 30575 1726867657.37693: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867657.37696: Calling groups_plugins_play to load vars for managed_node3 30575 1726867657.39861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867657.41645: done with get_vars() 30575 1726867657.41666: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 17:27:37 -0400 (0:00:00.502) 0:01:32.794 ****** 30575 1726867657.41741: entering _queue_task() for managed_node3/include_tasks 30575 1726867657.42151: worker is 1 (out of 1 available) 30575 1726867657.42164: exiting _queue_task() for managed_node3/include_tasks 30575 1726867657.42180: done queuing things up, now waiting for results queue to drain 30575 1726867657.42182: waiting for pending results... 
30575 1726867657.42711: running TaskExecutor() for managed_node3/TASK: Test 30575 1726867657.42716: in run() - task 0affcac9-a3a5-e081-a588-000000001748 30575 1726867657.42719: variable 'ansible_search_path' from source: unknown 30575 1726867657.42721: variable 'ansible_search_path' from source: unknown 30575 1726867657.42723: variable 'lsr_test' from source: include params 30575 1726867657.42967: variable 'lsr_test' from source: include params 30575 1726867657.43048: variable 'omit' from source: magic vars 30575 1726867657.43199: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867657.43213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867657.43227: variable 'omit' from source: magic vars 30575 1726867657.43491: variable 'ansible_distribution_major_version' from source: facts 30575 1726867657.43506: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867657.43517: variable 'item' from source: unknown 30575 1726867657.43592: variable 'item' from source: unknown 30575 1726867657.43627: variable 'item' from source: unknown 30575 1726867657.43701: variable 'item' from source: unknown 30575 1726867657.43995: dumping result to json 30575 1726867657.43998: done dumping result, returning 30575 1726867657.44001: done running TaskExecutor() for managed_node3/TASK: Test [0affcac9-a3a5-e081-a588-000000001748] 30575 1726867657.44003: sending task result for task 0affcac9-a3a5-e081-a588-000000001748 30575 1726867657.44207: no more pending results, returning what we have 30575 1726867657.44213: in VariableManager get_vars() 30575 1726867657.44254: Calling all_inventory to load vars for managed_node3 30575 1726867657.44257: Calling groups_inventory to load vars for managed_node3 30575 1726867657.44261: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867657.44273: Calling all_plugins_play to load vars for managed_node3 30575 1726867657.44279: Calling 
groups_plugins_inventory to load vars for managed_node3 30575 1726867657.44283: Calling groups_plugins_play to load vars for managed_node3 30575 1726867657.44992: done sending task result for task 0affcac9-a3a5-e081-a588-000000001748 30575 1726867657.44996: WORKER PROCESS EXITING 30575 1726867657.45807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867657.47436: done with get_vars() 30575 1726867657.47454: variable 'ansible_search_path' from source: unknown 30575 1726867657.47456: variable 'ansible_search_path' from source: unknown 30575 1726867657.47495: we have included files to process 30575 1726867657.47496: generating all_blocks data 30575 1726867657.47498: done generating all_blocks data 30575 1726867657.47505: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30575 1726867657.47506: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30575 1726867657.47508: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30575 1726867657.47701: done processing included file 30575 1726867657.47703: iterating over new_blocks loaded from include file 30575 1726867657.47705: in VariableManager get_vars() 30575 1726867657.47720: done with get_vars() 30575 1726867657.47722: filtering new block on tags 30575 1726867657.47756: done filtering new block on tags 30575 1726867657.47759: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node3 => (item=tasks/remove+down_profile.yml) 30575 1726867657.47764: extending task lists for all hosts with included blocks 30575 1726867657.48802: done 
extending task lists 30575 1726867657.48803: done processing included files 30575 1726867657.48804: results queue empty 30575 1726867657.48805: checking for any_errors_fatal 30575 1726867657.48807: done checking for any_errors_fatal 30575 1726867657.48807: checking for max_fail_percentage 30575 1726867657.48808: done checking for max_fail_percentage 30575 1726867657.48809: checking to see if all hosts have failed and the running result is not ok 30575 1726867657.48810: done checking to see if all hosts have failed 30575 1726867657.48811: getting the remaining hosts for this loop 30575 1726867657.48812: done getting the remaining hosts for this loop 30575 1726867657.48814: getting the next task for host managed_node3 30575 1726867657.48819: done getting next task for host managed_node3 30575 1726867657.48821: ^ task is: TASK: Include network role 30575 1726867657.48824: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867657.48827: getting variables 30575 1726867657.48828: in VariableManager get_vars() 30575 1726867657.48847: Calling all_inventory to load vars for managed_node3 30575 1726867657.48850: Calling groups_inventory to load vars for managed_node3 30575 1726867657.48852: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867657.48858: Calling all_plugins_play to load vars for managed_node3 30575 1726867657.48862: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867657.48864: Calling groups_plugins_play to load vars for managed_node3 30575 1726867657.50412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867657.52528: done with get_vars() 30575 1726867657.52551: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Friday 20 September 2024 17:27:37 -0400 (0:00:00.109) 0:01:32.904 ****** 30575 1726867657.52656: entering _queue_task() for managed_node3/include_role 30575 1726867657.53074: worker is 1 (out of 1 available) 30575 1726867657.53091: exiting _queue_task() for managed_node3/include_role 30575 1726867657.53112: done queuing things up, now waiting for results queue to drain 30575 1726867657.53114: waiting for pending results... 
30575 1726867657.53530: running TaskExecutor() for managed_node3/TASK: Include network role 30575 1726867657.53711: in run() - task 0affcac9-a3a5-e081-a588-000000001ca9 30575 1726867657.53728: variable 'ansible_search_path' from source: unknown 30575 1726867657.53731: variable 'ansible_search_path' from source: unknown 30575 1726867657.53787: calling self._execute() 30575 1726867657.54034: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867657.54037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867657.54041: variable 'omit' from source: magic vars 30575 1726867657.54700: variable 'ansible_distribution_major_version' from source: facts 30575 1726867657.54710: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867657.54716: _execute() done 30575 1726867657.54724: dumping result to json 30575 1726867657.54727: done dumping result, returning 30575 1726867657.54734: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcac9-a3a5-e081-a588-000000001ca9] 30575 1726867657.54740: sending task result for task 0affcac9-a3a5-e081-a588-000000001ca9 30575 1726867657.54884: no more pending results, returning what we have 30575 1726867657.54891: in VariableManager get_vars() 30575 1726867657.54940: Calling all_inventory to load vars for managed_node3 30575 1726867657.54942: Calling groups_inventory to load vars for managed_node3 30575 1726867657.54945: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867657.54958: Calling all_plugins_play to load vars for managed_node3 30575 1726867657.54960: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867657.54963: Calling groups_plugins_play to load vars for managed_node3 30575 1726867657.55593: done sending task result for task 0affcac9-a3a5-e081-a588-000000001ca9 30575 1726867657.55596: WORKER PROCESS EXITING 30575 1726867657.56626: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867657.58363: done with get_vars() 30575 1726867657.58388: variable 'ansible_search_path' from source: unknown 30575 1726867657.58390: variable 'ansible_search_path' from source: unknown 30575 1726867657.58548: variable 'omit' from source: magic vars 30575 1726867657.58594: variable 'omit' from source: magic vars 30575 1726867657.58610: variable 'omit' from source: magic vars 30575 1726867657.58614: we have included files to process 30575 1726867657.58615: generating all_blocks data 30575 1726867657.58616: done generating all_blocks data 30575 1726867657.58618: processing included file: fedora.linux_system_roles.network 30575 1726867657.58637: in VariableManager get_vars() 30575 1726867657.58659: done with get_vars() 30575 1726867657.58688: in VariableManager get_vars() 30575 1726867657.58707: done with get_vars() 30575 1726867657.58746: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30575 1726867657.58878: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30575 1726867657.58957: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30575 1726867657.59496: in VariableManager get_vars() 30575 1726867657.59521: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867657.63400: iterating over new_blocks loaded from include file 30575 1726867657.63403: in VariableManager get_vars() 30575 1726867657.63426: done with get_vars() 30575 1726867657.63428: filtering new block on tags 30575 1726867657.63717: done filtering new block on tags 30575 1726867657.63721: in VariableManager get_vars() 30575 1726867657.63737: done with get_vars() 30575 1726867657.63739: filtering new block on tags 30575 1726867657.63760: done 
filtering new block on tags 30575 1726867657.63762: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30575 1726867657.63768: extending task lists for all hosts with included blocks 30575 1726867657.63880: done extending task lists 30575 1726867657.63881: done processing included files 30575 1726867657.63882: results queue empty 30575 1726867657.63883: checking for any_errors_fatal 30575 1726867657.63887: done checking for any_errors_fatal 30575 1726867657.63888: checking for max_fail_percentage 30575 1726867657.63889: done checking for max_fail_percentage 30575 1726867657.63890: checking to see if all hosts have failed and the running result is not ok 30575 1726867657.63891: done checking to see if all hosts have failed 30575 1726867657.63891: getting the remaining hosts for this loop 30575 1726867657.63893: done getting the remaining hosts for this loop 30575 1726867657.63895: getting the next task for host managed_node3 30575 1726867657.63900: done getting next task for host managed_node3 30575 1726867657.63902: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867657.63906: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867657.63916: getting variables 30575 1726867657.63917: in VariableManager get_vars() 30575 1726867657.63930: Calling all_inventory to load vars for managed_node3 30575 1726867657.63932: Calling groups_inventory to load vars for managed_node3 30575 1726867657.63934: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867657.63940: Calling all_plugins_play to load vars for managed_node3 30575 1726867657.63942: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867657.63945: Calling groups_plugins_play to load vars for managed_node3 30575 1726867657.65332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867657.67080: done with get_vars() 30575 1726867657.67105: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:27:37 -0400 (0:00:00.145) 0:01:33.049 ****** 30575 1726867657.67191: entering _queue_task() for managed_node3/include_tasks 30575 1726867657.67680: worker is 1 (out of 1 available) 30575 1726867657.67693: exiting _queue_task() for managed_node3/include_tasks 30575 1726867657.67706: done queuing things up, now waiting for results queue to drain 30575 1726867657.67707: waiting for pending results... 
30575 1726867657.68040: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867657.68194: in run() - task 0affcac9-a3a5-e081-a588-000000001d2b 30575 1726867657.68213: variable 'ansible_search_path' from source: unknown 30575 1726867657.68245: variable 'ansible_search_path' from source: unknown 30575 1726867657.68351: calling self._execute() 30575 1726867657.68584: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867657.68604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867657.68619: variable 'omit' from source: magic vars 30575 1726867657.69610: variable 'ansible_distribution_major_version' from source: facts 30575 1726867657.69620: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867657.69630: _execute() done 30575 1726867657.69719: dumping result to json 30575 1726867657.69722: done dumping result, returning 30575 1726867657.69725: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-e081-a588-000000001d2b] 30575 1726867657.69726: sending task result for task 0affcac9-a3a5-e081-a588-000000001d2b 30575 1726867657.69875: no more pending results, returning what we have 30575 1726867657.69882: in VariableManager get_vars() 30575 1726867657.69934: Calling all_inventory to load vars for managed_node3 30575 1726867657.69936: Calling groups_inventory to load vars for managed_node3 30575 1726867657.69938: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867657.69949: Calling all_plugins_play to load vars for managed_node3 30575 1726867657.69952: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867657.69954: Calling groups_plugins_play to load vars for managed_node3 30575 1726867657.70767: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d2b 30575 
1726867657.70771: WORKER PROCESS EXITING 30575 1726867657.73556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867657.76549: done with get_vars() 30575 1726867657.76573: variable 'ansible_search_path' from source: unknown 30575 1726867657.76574: variable 'ansible_search_path' from source: unknown 30575 1726867657.76647: we have included files to process 30575 1726867657.76648: generating all_blocks data 30575 1726867657.76650: done generating all_blocks data 30575 1726867657.76653: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867657.76654: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867657.76656: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867657.78100: done processing included file 30575 1726867657.78102: iterating over new_blocks loaded from include file 30575 1726867657.78104: in VariableManager get_vars() 30575 1726867657.78228: done with get_vars() 30575 1726867657.78233: filtering new block on tags 30575 1726867657.78273: done filtering new block on tags 30575 1726867657.78279: in VariableManager get_vars() 30575 1726867657.78386: done with get_vars() 30575 1726867657.78388: filtering new block on tags 30575 1726867657.78567: done filtering new block on tags 30575 1726867657.78571: in VariableManager get_vars() 30575 1726867657.78605: done with get_vars() 30575 1726867657.78607: filtering new block on tags 30575 1726867657.78802: done filtering new block on tags 30575 1726867657.78805: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30575 1726867657.78811: extending task lists for all hosts 
with included blocks 30575 1726867657.82739: done extending task lists 30575 1726867657.82741: done processing included files 30575 1726867657.82746: results queue empty 30575 1726867657.82747: checking for any_errors_fatal 30575 1726867657.82750: done checking for any_errors_fatal 30575 1726867657.82750: checking for max_fail_percentage 30575 1726867657.82752: done checking for max_fail_percentage 30575 1726867657.82752: checking to see if all hosts have failed and the running result is not ok 30575 1726867657.82753: done checking to see if all hosts have failed 30575 1726867657.82754: getting the remaining hosts for this loop 30575 1726867657.82755: done getting the remaining hosts for this loop 30575 1726867657.82758: getting the next task for host managed_node3 30575 1726867657.82763: done getting next task for host managed_node3 30575 1726867657.82766: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867657.82769: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867657.82781: getting variables 30575 1726867657.82782: in VariableManager get_vars() 30575 1726867657.82797: Calling all_inventory to load vars for managed_node3 30575 1726867657.82799: Calling groups_inventory to load vars for managed_node3 30575 1726867657.82801: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867657.82806: Calling all_plugins_play to load vars for managed_node3 30575 1726867657.82808: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867657.82810: Calling groups_plugins_play to load vars for managed_node3 30575 1726867657.84013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867657.87000: done with get_vars() 30575 1726867657.87027: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:27:37 -0400 (0:00:00.199) 0:01:33.248 ****** 30575 1726867657.87118: entering _queue_task() for managed_node3/setup 30575 1726867657.87506: worker is 1 (out of 1 available) 30575 1726867657.87518: exiting _queue_task() for managed_node3/setup 30575 1726867657.87531: done queuing things up, now waiting for results queue to drain 30575 1726867657.87534: waiting for pending results... 
30575 1726867657.87874: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867657.88030: in run() - task 0affcac9-a3a5-e081-a588-000000001d82 30575 1726867657.88043: variable 'ansible_search_path' from source: unknown 30575 1726867657.88048: variable 'ansible_search_path' from source: unknown 30575 1726867657.88096: calling self._execute() 30575 1726867657.88213: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867657.88224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867657.88230: variable 'omit' from source: magic vars 30575 1726867657.89082: variable 'ansible_distribution_major_version' from source: facts 30575 1726867657.89086: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867657.89497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867657.91472: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867657.91535: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867657.91563: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867657.91615: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867657.91638: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867657.91693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867657.91715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867657.91735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867657.91761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867657.91772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867657.91836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867657.91859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867657.91876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867657.91904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867657.91918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867657.92056: variable '__network_required_facts' from source: role 
'' defaults 30575 1726867657.92065: variable 'ansible_facts' from source: unknown 30575 1726867657.93123: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30575 1726867657.93129: when evaluation is False, skipping this task 30575 1726867657.93131: _execute() done 30575 1726867657.93134: dumping result to json 30575 1726867657.93136: done dumping result, returning 30575 1726867657.93138: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-e081-a588-000000001d82] 30575 1726867657.93143: sending task result for task 0affcac9-a3a5-e081-a588-000000001d82 30575 1726867657.93368: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d82 30575 1726867657.93373: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867657.93430: no more pending results, returning what we have 30575 1726867657.93434: results queue empty 30575 1726867657.93435: checking for any_errors_fatal 30575 1726867657.93436: done checking for any_errors_fatal 30575 1726867657.93437: checking for max_fail_percentage 30575 1726867657.93439: done checking for max_fail_percentage 30575 1726867657.93440: checking to see if all hosts have failed and the running result is not ok 30575 1726867657.93440: done checking to see if all hosts have failed 30575 1726867657.93441: getting the remaining hosts for this loop 30575 1726867657.93442: done getting the remaining hosts for this loop 30575 1726867657.93446: getting the next task for host managed_node3 30575 1726867657.93457: done getting next task for host managed_node3 30575 1726867657.93462: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867657.93469: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867657.93498: getting variables 30575 1726867657.93503: in VariableManager get_vars() 30575 1726867657.93552: Calling all_inventory to load vars for managed_node3 30575 1726867657.93555: Calling groups_inventory to load vars for managed_node3 30575 1726867657.93557: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867657.93567: Calling all_plugins_play to load vars for managed_node3 30575 1726867657.93574: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867657.93835: Calling groups_plugins_play to load vars for managed_node3 30575 1726867657.95474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867657.97043: done with get_vars() 30575 1726867657.97064: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:27:37 -0400 (0:00:00.100) 0:01:33.348 ****** 30575 1726867657.97141: entering _queue_task() for managed_node3/stat 30575 1726867657.97386: worker is 1 (out of 1 available) 30575 1726867657.97401: exiting _queue_task() for managed_node3/stat 30575 1726867657.97414: done queuing things up, now waiting for results queue to drain 30575 1726867657.97418: waiting for pending results... 
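The `Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False` line above is the role's gather-facts gate: the setup task only runs when at least one fact the role depends on has not been gathered yet. A minimal Python sketch of that Jinja2 `difference`/`length` expression (the fact names below are illustrative, not the role's actual defaults):

```python
def facts_missing(required_facts, ansible_facts):
    """Mirror the Jinja2 gate:
    __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
    i.e. True when at least one required fact is absent."""
    missing = [f for f in required_facts if f not in ansible_facts]
    return len(missing) > 0

# Illustrative values only; the real list lives in the role's defaults.
required = ["distribution", "distribution_major_version"]
gathered = {"distribution": "Fedora", "distribution_major_version": "40"}
print(facts_missing(required, gathered))  # all present -> False -> task skipped
```

When the expression is `False`, as in the log, the task is skipped without contacting the host, which is why the trace jumps straight to "dumping result to json".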
30575 1726867657.97629: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867657.97886: in run() - task 0affcac9-a3a5-e081-a588-000000001d84 30575 1726867657.97890: variable 'ansible_search_path' from source: unknown 30575 1726867657.97893: variable 'ansible_search_path' from source: unknown 30575 1726867657.97896: calling self._execute() 30575 1726867657.97920: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867657.97925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867657.97932: variable 'omit' from source: magic vars 30575 1726867657.98421: variable 'ansible_distribution_major_version' from source: facts 30575 1726867657.98433: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867657.98796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867657.99401: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867657.99404: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867657.99406: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867657.99431: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867657.99555: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867657.99601: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867657.99632: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867657.99674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867657.99885: variable '__network_is_ostree' from source: set_fact 30575 1726867657.99888: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867657.99894: when evaluation is False, skipping this task 30575 1726867657.99897: _execute() done 30575 1726867657.99899: dumping result to json 30575 1726867657.99904: done dumping result, returning 30575 1726867657.99908: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-e081-a588-000000001d84] 30575 1726867657.99910: sending task result for task 0affcac9-a3a5-e081-a588-000000001d84 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867658.00204: no more pending results, returning what we have 30575 1726867658.00207: results queue empty 30575 1726867658.00208: checking for any_errors_fatal 30575 1726867658.00214: done checking for any_errors_fatal 30575 1726867658.00217: checking for max_fail_percentage 30575 1726867658.00219: done checking for max_fail_percentage 30575 1726867658.00219: checking to see if all hosts have failed and the running result is not ok 30575 1726867658.00220: done checking to see if all hosts have failed 30575 1726867658.00221: getting the remaining hosts for this loop 30575 1726867658.00222: done getting the remaining hosts for this loop 30575 1726867658.00225: getting the next task for host managed_node3 30575 1726867658.00232: done getting next task for host managed_node3 30575 
1726867658.00236: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867658.00241: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867658.00318: getting variables 30575 1726867658.00319: in VariableManager get_vars() 30575 1726867658.00356: Calling all_inventory to load vars for managed_node3 30575 1726867658.00358: Calling groups_inventory to load vars for managed_node3 30575 1726867658.00361: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867658.00506: Calling all_plugins_play to load vars for managed_node3 30575 1726867658.00510: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867658.00514: Calling groups_plugins_play to load vars for managed_node3 30575 1726867658.01102: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d84 30575 1726867658.01106: WORKER PROCESS EXITING 30575 1726867658.02730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867658.04688: done with get_vars() 30575 1726867658.04717: done getting variables 30575 1726867658.04836: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:27:38 -0400 (0:00:00.077) 0:01:33.426 ****** 30575 1726867658.04913: entering _queue_task() for managed_node3/set_fact 30575 1726867658.05398: worker is 1 (out of 1 available) 30575 1726867658.05413: exiting _queue_task() for managed_node3/set_fact 30575 1726867658.05431: done queuing things up, now waiting for results queue to drain 30575 1726867658.05433: waiting for pending results... 
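Both ostree tasks above ("Check if system is ostree" and "Set flag to indicate system is ostree") skip with `false_condition: "not __network_is_ostree is defined"` because the fact was already set by an earlier role invocation in this play. The condition is a run-once guard around an otherwise repeated remote stat. A rough Python equivalent of that guard (the cache dict and return strings are assumptions for illustration):

```python
def check_ostree(host_facts):
    """Run the detection only if the flag was never set, mirroring
    `when: not __network_is_ostree is defined` on the stat/set_fact tasks."""
    if "__network_is_ostree" in host_facts:
        return "skipped"  # log: "Conditional result was False"
    # Stand-in for the real remote stat (e.g. of an ostree marker file).
    host_facts["__network_is_ostree"] = False
    return "ran"

host_facts = {}
print(check_ostree(host_facts))  # first call runs the detection
print(check_ostree(host_facts))  # later calls are skipped
```

This is why repeated includes of `set_facts.yml` cost only a conditional evaluation (about 0.07s in the timing lines above) rather than a round trip to the managed node.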
30575 1726867658.05769: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867658.06169: in run() - task 0affcac9-a3a5-e081-a588-000000001d85 30575 1726867658.06250: variable 'ansible_search_path' from source: unknown 30575 1726867658.06254: variable 'ansible_search_path' from source: unknown 30575 1726867658.06257: calling self._execute() 30575 1726867658.06439: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867658.06498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867658.06518: variable 'omit' from source: magic vars 30575 1726867658.07024: variable 'ansible_distribution_major_version' from source: facts 30575 1726867658.07126: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867658.07247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867658.07639: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867658.07705: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867658.07805: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867658.07843: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867658.07952: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867658.07985: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867658.08033: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867658.08129: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867658.08250: variable '__network_is_ostree' from source: set_fact 30575 1726867658.08261: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867658.08268: when evaluation is False, skipping this task 30575 1726867658.08275: _execute() done 30575 1726867658.08332: dumping result to json 30575 1726867658.08335: done dumping result, returning 30575 1726867658.08338: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-e081-a588-000000001d85] 30575 1726867658.08340: sending task result for task 0affcac9-a3a5-e081-a588-000000001d85 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867658.08486: no more pending results, returning what we have 30575 1726867658.08491: results queue empty 30575 1726867658.08492: checking for any_errors_fatal 30575 1726867658.08499: done checking for any_errors_fatal 30575 1726867658.08500: checking for max_fail_percentage 30575 1726867658.08502: done checking for max_fail_percentage 30575 1726867658.08503: checking to see if all hosts have failed and the running result is not ok 30575 1726867658.08504: done checking to see if all hosts have failed 30575 1726867658.08505: getting the remaining hosts for this loop 30575 1726867658.08506: done getting the remaining hosts for this loop 30575 1726867658.08510: getting the next task for host managed_node3 30575 1726867658.08525: done getting next task for host managed_node3 30575 
1726867658.08529: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867658.08535: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867658.08708: getting variables 30575 1726867658.08710: in VariableManager get_vars() 30575 1726867658.08754: Calling all_inventory to load vars for managed_node3 30575 1726867658.08757: Calling groups_inventory to load vars for managed_node3 30575 1726867658.08759: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867658.08769: Calling all_plugins_play to load vars for managed_node3 30575 1726867658.08772: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867658.08775: Calling groups_plugins_play to load vars for managed_node3 30575 1726867658.09510: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d85 30575 1726867658.09513: WORKER PROCESS EXITING 30575 1726867658.10434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867658.12151: done with get_vars() 30575 1726867658.12173: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:27:38 -0400 (0:00:00.073) 0:01:33.500 ****** 30575 1726867658.12283: entering _queue_task() for managed_node3/service_facts 30575 1726867658.12794: worker is 1 (out of 1 available) 30575 1726867658.12805: exiting _queue_task() for managed_node3/service_facts 30575 1726867658.12818: done queuing things up, now waiting for results queue to drain 30575 1726867658.12819: waiting for pending results... 
30575 1726867658.12917: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867658.13089: in run() - task 0affcac9-a3a5-e081-a588-000000001d87 30575 1726867658.13108: variable 'ansible_search_path' from source: unknown 30575 1726867658.13115: variable 'ansible_search_path' from source: unknown 30575 1726867658.13165: calling self._execute() 30575 1726867658.13268: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867658.13285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867658.13300: variable 'omit' from source: magic vars 30575 1726867658.13690: variable 'ansible_distribution_major_version' from source: facts 30575 1726867658.13714: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867658.13729: variable 'omit' from source: magic vars 30575 1726867658.13821: variable 'omit' from source: magic vars 30575 1726867658.13855: variable 'omit' from source: magic vars 30575 1726867658.13899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867658.13951: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867658.13979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867658.14002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867658.14027: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867658.14067: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867658.14106: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867658.14109: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867658.14187: Set connection var ansible_pipelining to False 30575 1726867658.14190: Set connection var ansible_shell_type to sh 30575 1726867658.14195: Set connection var ansible_shell_executable to /bin/sh 30575 1726867658.14200: Set connection var ansible_timeout to 10 30575 1726867658.14205: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867658.14213: Set connection var ansible_connection to ssh 30575 1726867658.14232: variable 'ansible_shell_executable' from source: unknown 30575 1726867658.14235: variable 'ansible_connection' from source: unknown 30575 1726867658.14238: variable 'ansible_module_compression' from source: unknown 30575 1726867658.14249: variable 'ansible_shell_type' from source: unknown 30575 1726867658.14252: variable 'ansible_shell_executable' from source: unknown 30575 1726867658.14254: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867658.14256: variable 'ansible_pipelining' from source: unknown 30575 1726867658.14259: variable 'ansible_timeout' from source: unknown 30575 1726867658.14261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867658.14411: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867658.14421: variable 'omit' from source: magic vars 30575 1726867658.14425: starting attempt loop 30575 1726867658.14429: running the handler 30575 1726867658.14439: _low_level_execute_command(): starting 30575 1726867658.14447: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867658.14955: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30575 1726867658.14960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867658.14964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867658.15011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867658.15015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867658.15019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867658.15073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867658.16753: stdout chunk (state=3): >>>/root <<< 30575 1726867658.16912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867658.16918: stdout chunk (state=3): >>><<< 30575 1726867658.16920: stderr chunk (state=3): >>><<< 30575 1726867658.17024: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867658.17030: _low_level_execute_command(): starting 30575 1726867658.17032: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867658.1693997-35034-13393300183553 `" && echo ansible-tmp-1726867658.1693997-35034-13393300183553="` echo /root/.ansible/tmp/ansible-tmp-1726867658.1693997-35034-13393300183553 `" ) && sleep 0' 30575 1726867658.17537: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867658.17540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867658.17543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867658.17552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867658.17612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867658.17618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867658.17620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867658.17667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867658.19545: stdout chunk (state=3): >>>ansible-tmp-1726867658.1693997-35034-13393300183553=/root/.ansible/tmp/ansible-tmp-1726867658.1693997-35034-13393300183553 <<< 30575 1726867658.19680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867658.19702: stderr chunk (state=3): >>><<< 30575 1726867658.19705: stdout chunk (state=3): >>><<< 30575 1726867658.19723: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867658.1693997-35034-13393300183553=/root/.ansible/tmp/ansible-tmp-1726867658.1693997-35034-13393300183553 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867658.19773: variable 'ansible_module_compression' from source: unknown 30575 1726867658.19808: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30575 1726867658.19837: variable 'ansible_facts' from source: unknown 30575 1726867658.19896: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867658.1693997-35034-13393300183553/AnsiballZ_service_facts.py 30575 1726867658.19995: Sending initial data 30575 1726867658.19999: Sent initial data (161 bytes) 30575 1726867658.20430: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867658.20433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867658.20436: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867658.20438: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867658.20440: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867658.20483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867658.20491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867658.20536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867658.22085: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867658.22089: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867658.22128: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867658.22169: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpz5encthx /root/.ansible/tmp/ansible-tmp-1726867658.1693997-35034-13393300183553/AnsiballZ_service_facts.py <<< 30575 1726867658.22176: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867658.1693997-35034-13393300183553/AnsiballZ_service_facts.py" <<< 30575 1726867658.22220: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpz5encthx" to remote "/root/.ansible/tmp/ansible-tmp-1726867658.1693997-35034-13393300183553/AnsiballZ_service_facts.py" <<< 30575 1726867658.22223: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867658.1693997-35034-13393300183553/AnsiballZ_service_facts.py" <<< 30575 1726867658.23187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867658.23190: stdout chunk (state=3): >>><<< 30575 1726867658.23193: stderr chunk (state=3): >>><<< 30575 1726867658.23392: done transferring module to remote 30575 1726867658.23403: _low_level_execute_command(): starting 30575 1726867658.23406: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867658.1693997-35034-13393300183553/ /root/.ansible/tmp/ansible-tmp-1726867658.1693997-35034-13393300183553/AnsiballZ_service_facts.py && sleep 0' 30575 1726867658.24202: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867658.24212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867658.24222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867658.24236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867658.24249: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867658.24261: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867658.24272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867658.24294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867658.24297: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867658.24305: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867658.24314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867658.24374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867658.24417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867658.24425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867658.24432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867658.24523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867658.26291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867658.26294: stdout chunk (state=3): >>><<< 30575 1726867658.26297: stderr chunk (state=3): >>><<< 30575 1726867658.26311: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867658.26392: _low_level_execute_command(): starting 30575 1726867658.26396: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867658.1693997-35034-13393300183553/AnsiballZ_service_facts.py && sleep 0' 30575 1726867658.26988: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867658.27005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867658.27033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867658.27055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867658.27129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867659.78311: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, 
"hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 30575 1726867659.78326: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 30575 1726867659.78382: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": 
"alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive",<<< 30575 1726867659.78398: stdout chunk (state=3): >>> "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": 
"systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": 
{"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30575 1726867659.79891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867659.79922: stderr chunk (state=3): >>><<< 30575 1726867659.79927: stdout chunk (state=3): >>><<< 30575 1726867659.79948: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": 
{"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": 
"sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": 
{"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.15.68 closed. 30575 1726867659.80722: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867658.1693997-35034-13393300183553/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867659.80729: _low_level_execute_command(): starting 30575 1726867659.80735: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867658.1693997-35034-13393300183553/ > /dev/null 2>&1 && sleep 0' 30575 1726867659.81166: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867659.81200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867659.81203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867659.81207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867659.81209: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867659.81211: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867659.81213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867659.81266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867659.81273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867659.81275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867659.81314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867659.83108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867659.83133: stderr chunk (state=3): >>><<< 30575 1726867659.83136: stdout chunk (state=3): >>><<< 30575 1726867659.83155: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867659.83159: handler run complete 30575 1726867659.83265: variable 'ansible_facts' from source: unknown 30575 1726867659.83351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867659.83626: variable 'ansible_facts' from source: unknown 30575 1726867659.83704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867659.83821: attempt loop complete, returning result 30575 1726867659.83825: _execute() done 30575 1726867659.83827: dumping result to json 30575 1726867659.83861: done dumping result, returning 30575 1726867659.83868: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-e081-a588-000000001d87] 30575 1726867659.83873: sending task result for task 0affcac9-a3a5-e081-a588-000000001d87 30575 1726867659.84765: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d87 30575 1726867659.84768: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867659.84822: no more pending results, returning what we have 30575 1726867659.84824: results queue empty 30575 1726867659.84825: checking for any_errors_fatal 30575 1726867659.84832: done checking for any_errors_fatal 30575 1726867659.84833: checking for max_fail_percentage 30575 1726867659.84834: done checking for max_fail_percentage 30575 1726867659.84835: checking to see if all hosts have failed and the running result is not ok 30575 1726867659.84835: done checking to see if all hosts have failed 30575 1726867659.84836: getting the 
remaining hosts for this loop 30575 1726867659.84837: done getting the remaining hosts for this loop 30575 1726867659.84839: getting the next task for host managed_node3 30575 1726867659.84844: done getting next task for host managed_node3 30575 1726867659.84846: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867659.84850: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867659.84858: getting variables 30575 1726867659.84859: in VariableManager get_vars() 30575 1726867659.84884: Calling all_inventory to load vars for managed_node3 30575 1726867659.84886: Calling groups_inventory to load vars for managed_node3 30575 1726867659.84887: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867659.84893: Calling all_plugins_play to load vars for managed_node3 30575 1726867659.84898: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867659.84900: Calling groups_plugins_play to load vars for managed_node3 30575 1726867659.85832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867659.86965: done with get_vars() 30575 1726867659.86983: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:27:39 -0400 (0:00:01.748) 0:01:35.248 ****** 30575 1726867659.87089: entering _queue_task() for managed_node3/package_facts 30575 1726867659.87341: worker is 1 (out of 1 available) 30575 1726867659.87354: exiting _queue_task() for managed_node3/package_facts 30575 1726867659.87371: done queuing things up, now waiting for results queue to drain 30575 1726867659.87373: waiting for pending results... 
30575 1726867659.87633: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867659.87792: in run() - task 0affcac9-a3a5-e081-a588-000000001d88 30575 1726867659.87796: variable 'ansible_search_path' from source: unknown 30575 1726867659.87799: variable 'ansible_search_path' from source: unknown 30575 1726867659.87831: calling self._execute() 30575 1726867659.87903: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867659.87906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867659.87918: variable 'omit' from source: magic vars 30575 1726867659.88209: variable 'ansible_distribution_major_version' from source: facts 30575 1726867659.88221: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867659.88224: variable 'omit' from source: magic vars 30575 1726867659.88270: variable 'omit' from source: magic vars 30575 1726867659.88298: variable 'omit' from source: magic vars 30575 1726867659.88329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867659.88354: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867659.88370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867659.88386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867659.88401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867659.88423: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867659.88426: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867659.88428: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867659.88497: Set connection var ansible_pipelining to False 30575 1726867659.88500: Set connection var ansible_shell_type to sh 30575 1726867659.88504: Set connection var ansible_shell_executable to /bin/sh 30575 1726867659.88513: Set connection var ansible_timeout to 10 30575 1726867659.88519: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867659.88522: Set connection var ansible_connection to ssh 30575 1726867659.88540: variable 'ansible_shell_executable' from source: unknown 30575 1726867659.88542: variable 'ansible_connection' from source: unknown 30575 1726867659.88545: variable 'ansible_module_compression' from source: unknown 30575 1726867659.88548: variable 'ansible_shell_type' from source: unknown 30575 1726867659.88550: variable 'ansible_shell_executable' from source: unknown 30575 1726867659.88552: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867659.88554: variable 'ansible_pipelining' from source: unknown 30575 1726867659.88556: variable 'ansible_timeout' from source: unknown 30575 1726867659.88561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867659.88700: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867659.88709: variable 'omit' from source: magic vars 30575 1726867659.88714: starting attempt loop 30575 1726867659.88720: running the handler 30575 1726867659.88731: _low_level_execute_command(): starting 30575 1726867659.88737: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867659.89293: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867659.89297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867659.89314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867659.89368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867659.90948: stdout chunk (state=3): >>>/root <<< 30575 1726867659.91057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867659.91079: stderr chunk (state=3): >>><<< 30575 1726867659.91083: stdout chunk (state=3): >>><<< 30575 1726867659.91098: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867659.91112: _low_level_execute_command(): starting 30575 1726867659.91122: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867659.9109836-35098-103012313935858 `" && echo ansible-tmp-1726867659.9109836-35098-103012313935858="` echo /root/.ansible/tmp/ansible-tmp-1726867659.9109836-35098-103012313935858 `" ) && sleep 0' 30575 1726867659.91546: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867659.91550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867659.91553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867659.91557: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 
1726867659.91565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867659.91608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867659.91612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867659.91664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867659.93528: stdout chunk (state=3): >>>ansible-tmp-1726867659.9109836-35098-103012313935858=/root/.ansible/tmp/ansible-tmp-1726867659.9109836-35098-103012313935858 <<< 30575 1726867659.93641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867659.93662: stderr chunk (state=3): >>><<< 30575 1726867659.93665: stdout chunk (state=3): >>><<< 30575 1726867659.93676: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867659.9109836-35098-103012313935858=/root/.ansible/tmp/ansible-tmp-1726867659.9109836-35098-103012313935858 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867659.93710: variable 'ansible_module_compression' from source: unknown 30575 1726867659.93751: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30575 1726867659.93800: variable 'ansible_facts' from source: unknown 30575 1726867659.93948: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867659.9109836-35098-103012313935858/AnsiballZ_package_facts.py 30575 1726867659.94045: Sending initial data 30575 1726867659.94052: Sent initial data (162 bytes) 30575 1726867659.94594: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867659.94597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867659.94603: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867659.94625: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867659.94658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867659.94704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867659.96227: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867659.96231: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867659.96267: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867659.96318: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp9kc0anvd /root/.ansible/tmp/ansible-tmp-1726867659.9109836-35098-103012313935858/AnsiballZ_package_facts.py <<< 30575 1726867659.96324: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867659.9109836-35098-103012313935858/AnsiballZ_package_facts.py" <<< 30575 1726867659.96360: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp9kc0anvd" to remote "/root/.ansible/tmp/ansible-tmp-1726867659.9109836-35098-103012313935858/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867659.9109836-35098-103012313935858/AnsiballZ_package_facts.py" <<< 30575 1726867659.97611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867659.97652: stderr chunk (state=3): >>><<< 30575 1726867659.97655: stdout chunk (state=3): >>><<< 30575 1726867659.97680: done transferring module to remote 30575 1726867659.97689: _low_level_execute_command(): starting 30575 1726867659.97694: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867659.9109836-35098-103012313935858/ /root/.ansible/tmp/ansible-tmp-1726867659.9109836-35098-103012313935858/AnsiballZ_package_facts.py && sleep 0' 30575 1726867659.98160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867659.98271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867659.98275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867659.98280: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867659.98282: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867659.98285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867659.98292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867659.98341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867659.98388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867660.00183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867660.00186: stdout chunk (state=3): >>><<< 30575 1726867660.00188: stderr chunk (state=3): >>><<< 30575 1726867660.00368: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867660.00371: _low_level_execute_command(): starting 30575 1726867660.00373: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867659.9109836-35098-103012313935858/AnsiballZ_package_facts.py && sleep 0' 30575 1726867660.00960: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867660.00963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867660.00975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867660.00983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867660.01003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867660.01007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867660.01058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867660.01065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867660.01122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867660.45087: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": 
[{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": 
"4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30575 1726867660.45109: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": 
[{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": 
"squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": 
"5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 30575 1726867660.45136: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": 
"grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": 
[{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": 
[{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": 
"gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": 
"rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", 
"version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 
0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": 
[{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", 
"version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30575 1726867660.46889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867660.46920: stderr chunk (state=3): >>><<< 30575 1726867660.46923: stdout chunk (state=3): >>><<< 30575 1726867660.46963: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867660.54293: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867659.9109836-35098-103012313935858/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867660.54307: _low_level_execute_command(): starting 30575 1726867660.54310: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867659.9109836-35098-103012313935858/ > /dev/null 2>&1 && sleep 0' 30575 1726867660.54971: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867660.54975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867660.55050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867660.55084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867660.55119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867660.56990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867660.57012: stderr chunk (state=3): >>><<< 30575 1726867660.57018: stdout chunk (state=3): >>><<< 30575 1726867660.57028: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867660.57034: handler run complete 30575 1726867660.57495: variable 'ansible_facts' from source: unknown 30575 1726867660.57805: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867660.58855: variable 'ansible_facts' from source: unknown 30575 1726867660.59094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867660.59487: attempt loop complete, returning result 30575 1726867660.59490: _execute() done 30575 1726867660.59493: dumping result to json 30575 1726867660.59610: done dumping result, returning 30575 1726867660.59619: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-e081-a588-000000001d88] 30575 1726867660.59621: sending task result for task 0affcac9-a3a5-e081-a588-000000001d88 30575 1726867660.65005: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d88 30575 1726867660.65010: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867660.65051: no more pending results, returning what we have 30575 1726867660.65052: results queue empty 30575 1726867660.65053: checking for any_errors_fatal 30575 1726867660.65055: done checking for any_errors_fatal 30575 1726867660.65056: checking for max_fail_percentage 30575 1726867660.65056: done checking for max_fail_percentage 30575 1726867660.65057: checking to see if all hosts have failed and the running result is not ok 30575 1726867660.65058: done checking to see if all hosts have failed 30575 1726867660.65058: getting the remaining hosts for this loop 30575 1726867660.65059: done getting the remaining hosts for this loop 30575 1726867660.65061: getting the next task for host managed_node3 30575 1726867660.65064: done getting next task for host managed_node3 30575 1726867660.65066: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867660.65070: 
^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867660.65075: getting variables 30575 1726867660.65076: in VariableManager get_vars() 30575 1726867660.65091: Calling all_inventory to load vars for managed_node3 30575 1726867660.65093: Calling groups_inventory to load vars for managed_node3 30575 1726867660.65094: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867660.65098: Calling all_plugins_play to load vars for managed_node3 30575 1726867660.65099: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867660.65101: Calling groups_plugins_play to load vars for managed_node3 30575 1726867660.65735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867660.66589: done with get_vars() 30575 1726867660.66603: done getting variables 30575 1726867660.66641: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:27:40 -0400 (0:00:00.795) 0:01:36.044 ****** 30575 1726867660.66666: entering _queue_task() for managed_node3/debug 30575 1726867660.66945: worker is 1 (out of 1 available) 30575 1726867660.66958: exiting _queue_task() for managed_node3/debug 30575 1726867660.66971: done queuing things up, now waiting for results queue to drain 30575 1726867660.66973: waiting for pending results... 
30575 1726867660.67158: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867660.67274: in run() - task 0affcac9-a3a5-e081-a588-000000001d2c 30575 1726867660.67288: variable 'ansible_search_path' from source: unknown 30575 1726867660.67292: variable 'ansible_search_path' from source: unknown 30575 1726867660.67327: calling self._execute() 30575 1726867660.67404: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867660.67407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867660.67429: variable 'omit' from source: magic vars 30575 1726867660.67707: variable 'ansible_distribution_major_version' from source: facts 30575 1726867660.67718: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867660.67722: variable 'omit' from source: magic vars 30575 1726867660.67769: variable 'omit' from source: magic vars 30575 1726867660.67841: variable 'network_provider' from source: set_fact 30575 1726867660.67855: variable 'omit' from source: magic vars 30575 1726867660.67896: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867660.67925: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867660.67943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867660.67957: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867660.67974: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867660.67997: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867660.68000: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867660.68002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867660.68071: Set connection var ansible_pipelining to False 30575 1726867660.68084: Set connection var ansible_shell_type to sh 30575 1726867660.68090: Set connection var ansible_shell_executable to /bin/sh 30575 1726867660.68096: Set connection var ansible_timeout to 10 30575 1726867660.68101: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867660.68107: Set connection var ansible_connection to ssh 30575 1726867660.68126: variable 'ansible_shell_executable' from source: unknown 30575 1726867660.68130: variable 'ansible_connection' from source: unknown 30575 1726867660.68132: variable 'ansible_module_compression' from source: unknown 30575 1726867660.68135: variable 'ansible_shell_type' from source: unknown 30575 1726867660.68137: variable 'ansible_shell_executable' from source: unknown 30575 1726867660.68139: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867660.68141: variable 'ansible_pipelining' from source: unknown 30575 1726867660.68144: variable 'ansible_timeout' from source: unknown 30575 1726867660.68148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867660.68251: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867660.68261: variable 'omit' from source: magic vars 30575 1726867660.68264: starting attempt loop 30575 1726867660.68267: running the handler 30575 1726867660.68307: handler run complete 30575 1726867660.68320: attempt loop complete, returning result 30575 1726867660.68323: _execute() done 30575 1726867660.68326: dumping result to json 30575 1726867660.68329: done dumping result, returning 
30575 1726867660.68332: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-e081-a588-000000001d2c] 30575 1726867660.68338: sending task result for task 0affcac9-a3a5-e081-a588-000000001d2c 30575 1726867660.68419: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d2c 30575 1726867660.68422: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30575 1726867660.68488: no more pending results, returning what we have 30575 1726867660.68492: results queue empty 30575 1726867660.68492: checking for any_errors_fatal 30575 1726867660.68503: done checking for any_errors_fatal 30575 1726867660.68503: checking for max_fail_percentage 30575 1726867660.68505: done checking for max_fail_percentage 30575 1726867660.68506: checking to see if all hosts have failed and the running result is not ok 30575 1726867660.68507: done checking to see if all hosts have failed 30575 1726867660.68507: getting the remaining hosts for this loop 30575 1726867660.68509: done getting the remaining hosts for this loop 30575 1726867660.68512: getting the next task for host managed_node3 30575 1726867660.68522: done getting next task for host managed_node3 30575 1726867660.68526: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867660.68532: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867660.68543: getting variables 30575 1726867660.68545: in VariableManager get_vars() 30575 1726867660.68589: Calling all_inventory to load vars for managed_node3 30575 1726867660.68591: Calling groups_inventory to load vars for managed_node3 30575 1726867660.68593: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867660.68602: Calling all_plugins_play to load vars for managed_node3 30575 1726867660.68604: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867660.68606: Calling groups_plugins_play to load vars for managed_node3 30575 1726867660.69464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867660.70333: done with get_vars() 30575 1726867660.70348: done getting variables 30575 1726867660.70387: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:27:40 -0400 (0:00:00.037) 0:01:36.081 ****** 30575 1726867660.70414: entering _queue_task() for managed_node3/fail 30575 1726867660.70626: worker is 1 (out of 1 available) 30575 1726867660.70639: exiting _queue_task() for managed_node3/fail 30575 1726867660.70651: done queuing things up, now waiting for results queue to drain 30575 1726867660.70652: waiting for pending results... 30575 1726867660.70847: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867660.70963: in run() - task 0affcac9-a3a5-e081-a588-000000001d2d 30575 1726867660.70974: variable 'ansible_search_path' from source: unknown 30575 1726867660.70980: variable 'ansible_search_path' from source: unknown 30575 1726867660.71012: calling self._execute() 30575 1726867660.71093: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867660.71098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867660.71111: variable 'omit' from source: magic vars 30575 1726867660.71395: variable 'ansible_distribution_major_version' from source: facts 30575 1726867660.71404: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867660.71497: variable 'network_state' from source: role '' defaults 30575 1726867660.71504: Evaluated conditional (network_state != {}): False 30575 1726867660.71508: when evaluation is False, skipping this task 30575 1726867660.71511: _execute() done 30575 1726867660.71513: dumping result to json 30575 1726867660.71516: done dumping result, returning 30575 1726867660.71526: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-e081-a588-000000001d2d] 30575 1726867660.71530: sending task result for task 0affcac9-a3a5-e081-a588-000000001d2d 30575 1726867660.71613: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d2d 30575 1726867660.71615: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867660.71695: no more pending results, returning what we have 30575 1726867660.71698: results queue empty 30575 1726867660.71698: checking for any_errors_fatal 30575 1726867660.71703: done checking for any_errors_fatal 30575 1726867660.71704: checking for max_fail_percentage 30575 1726867660.71706: done checking for max_fail_percentage 30575 1726867660.71706: checking to see if all hosts have failed and the running result is not ok 30575 1726867660.71707: done checking to see if all hosts have failed 30575 1726867660.71708: getting the remaining hosts for this loop 30575 1726867660.71709: done getting the remaining hosts for this loop 30575 1726867660.71712: getting the next task for host managed_node3 30575 1726867660.71718: done getting next task for host managed_node3 30575 1726867660.71722: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867660.71726: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867660.71745: getting variables 30575 1726867660.71746: in VariableManager get_vars() 30575 1726867660.71779: Calling all_inventory to load vars for managed_node3 30575 1726867660.71781: Calling groups_inventory to load vars for managed_node3 30575 1726867660.71784: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867660.71796: Calling all_plugins_play to load vars for managed_node3 30575 1726867660.71798: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867660.71800: Calling groups_plugins_play to load vars for managed_node3 30575 1726867660.72530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867660.73427: done with get_vars() 30575 1726867660.73442: done getting variables 30575 1726867660.73480: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:27:40 -0400 (0:00:00.030) 0:01:36.112 ****** 30575 1726867660.73504: entering _queue_task() for managed_node3/fail 30575 1726867660.73696: worker is 1 (out of 1 available) 30575 1726867660.73709: exiting _queue_task() for managed_node3/fail 30575 1726867660.73723: done queuing things up, now waiting for results queue to drain 30575 1726867660.73725: waiting for pending results... 30575 1726867660.73903: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867660.74018: in run() - task 0affcac9-a3a5-e081-a588-000000001d2e 30575 1726867660.74026: variable 'ansible_search_path' from source: unknown 30575 1726867660.74030: variable 'ansible_search_path' from source: unknown 30575 1726867660.74059: calling self._execute() 30575 1726867660.74138: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867660.74141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867660.74150: variable 'omit' from source: magic vars 30575 1726867660.74427: variable 'ansible_distribution_major_version' from source: facts 30575 1726867660.74436: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867660.74526: variable 'network_state' from source: role '' defaults 30575 1726867660.74534: Evaluated conditional (network_state != {}): False 30575 1726867660.74537: when evaluation is False, skipping this task 30575 1726867660.74541: _execute() done 30575 1726867660.74543: dumping result to json 30575 1726867660.74546: done dumping result, returning 30575 1726867660.74554: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-e081-a588-000000001d2e] 30575 1726867660.74558: sending task result for task 0affcac9-a3a5-e081-a588-000000001d2e 30575 1726867660.74643: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d2e 30575 1726867660.74646: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867660.74692: no more pending results, returning what we have 30575 1726867660.74696: results queue empty 30575 1726867660.74696: checking for any_errors_fatal 30575 1726867660.74702: done checking for any_errors_fatal 30575 1726867660.74703: checking for max_fail_percentage 30575 1726867660.74704: done checking for max_fail_percentage 30575 1726867660.74705: checking to see if all hosts have failed and the running result is not ok 30575 1726867660.74706: done checking to see if all hosts have failed 30575 1726867660.74707: getting the remaining hosts for this loop 30575 1726867660.74708: done getting the remaining hosts for this loop 30575 1726867660.74711: getting the next task for host managed_node3 30575 1726867660.74719: done getting next task for host managed_node3 30575 1726867660.74722: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867660.74727: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867660.74746: getting variables 30575 1726867660.74748: in VariableManager get_vars() 30575 1726867660.74783: Calling all_inventory to load vars for managed_node3 30575 1726867660.74785: Calling groups_inventory to load vars for managed_node3 30575 1726867660.74787: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867660.74795: Calling all_plugins_play to load vars for managed_node3 30575 1726867660.74797: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867660.74799: Calling groups_plugins_play to load vars for managed_node3 30575 1726867660.75693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867660.76559: done with get_vars() 30575 1726867660.76578: done getting variables 30575 1726867660.76619: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:27:40 -0400 (0:00:00.031) 0:01:36.144 ****** 30575 1726867660.76644: entering _queue_task() for managed_node3/fail 30575 1726867660.76867: worker is 1 (out of 1 available) 30575 1726867660.76879: exiting _queue_task() for managed_node3/fail 30575 1726867660.76891: done queuing things up, now waiting for results queue to drain 30575 1726867660.76892: waiting for pending results... 30575 1726867660.77079: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867660.77186: in run() - task 0affcac9-a3a5-e081-a588-000000001d2f 30575 1726867660.77197: variable 'ansible_search_path' from source: unknown 30575 1726867660.77201: variable 'ansible_search_path' from source: unknown 30575 1726867660.77237: calling self._execute() 30575 1726867660.77305: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867660.77308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867660.77319: variable 'omit' from source: magic vars 30575 1726867660.77603: variable 'ansible_distribution_major_version' from source: facts 30575 1726867660.77612: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867660.77735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867660.79258: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867660.79307: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867660.79335: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867660.79361: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867660.79382: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867660.79441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867660.79471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867660.79491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867660.79525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867660.79534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867660.79601: variable 'ansible_distribution_major_version' from source: facts 30575 1726867660.79614: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30575 1726867660.79692: variable 'ansible_distribution' from source: facts 30575 1726867660.79696: variable '__network_rh_distros' from source: role '' defaults 30575 1726867660.79703: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30575 1726867660.79862: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867660.79882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867660.79898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867660.79924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867660.79934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867660.79972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867660.79990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867660.80006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867660.80031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 
1726867660.80041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867660.80074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867660.80092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867660.80108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867660.80133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867660.80144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867660.80333: variable 'network_connections' from source: include params 30575 1726867660.80341: variable 'interface' from source: play vars 30575 1726867660.80390: variable 'interface' from source: play vars 30575 1726867660.80398: variable 'network_state' from source: role '' defaults 30575 1726867660.80444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867660.80554: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867660.80582: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867660.80605: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867660.80630: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867660.80661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867660.80676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867660.80698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867660.80723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867660.80739: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30575 1726867660.80742: when evaluation is False, skipping this task 30575 1726867660.80745: _execute() done 30575 1726867660.80748: dumping result to json 30575 1726867660.80750: done dumping result, returning 30575 1726867660.80757: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-e081-a588-000000001d2f] 30575 1726867660.80762: sending task result for task 
0affcac9-a3a5-e081-a588-000000001d2f 30575 1726867660.80844: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d2f 30575 1726867660.80847: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30575 1726867660.80889: no more pending results, returning what we have 30575 1726867660.80893: results queue empty 30575 1726867660.80893: checking for any_errors_fatal 30575 1726867660.80898: done checking for any_errors_fatal 30575 1726867660.80899: checking for max_fail_percentage 30575 1726867660.80901: done checking for max_fail_percentage 30575 1726867660.80902: checking to see if all hosts have failed and the running result is not ok 30575 1726867660.80903: done checking to see if all hosts have failed 30575 1726867660.80903: getting the remaining hosts for this loop 30575 1726867660.80905: done getting the remaining hosts for this loop 30575 1726867660.80908: getting the next task for host managed_node3 30575 1726867660.80919: done getting next task for host managed_node3 30575 1726867660.80922: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867660.80926: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867660.80949: getting variables 30575 1726867660.80950: in VariableManager get_vars() 30575 1726867660.80996: Calling all_inventory to load vars for managed_node3 30575 1726867660.80999: Calling groups_inventory to load vars for managed_node3 30575 1726867660.81001: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867660.81009: Calling all_plugins_play to load vars for managed_node3 30575 1726867660.81012: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867660.81014: Calling groups_plugins_play to load vars for managed_node3 30575 1726867660.81814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867660.82700: done with get_vars() 30575 1726867660.82717: done getting variables 30575 1726867660.82759: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:27:40 -0400 (0:00:00.061) 0:01:36.205 ****** 30575 1726867660.82784: entering _queue_task() for managed_node3/dnf 30575 1726867660.82990: worker is 1 (out of 1 available) 30575 1726867660.83003: exiting _queue_task() for managed_node3/dnf 30575 1726867660.83018: done queuing things up, now waiting for results queue to drain 30575 1726867660.83020: waiting for pending results... 30575 1726867660.83195: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867660.83293: in run() - task 0affcac9-a3a5-e081-a588-000000001d30 30575 1726867660.83303: variable 'ansible_search_path' from source: unknown 30575 1726867660.83306: variable 'ansible_search_path' from source: unknown 30575 1726867660.83336: calling self._execute() 30575 1726867660.83407: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867660.83411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867660.83420: variable 'omit' from source: magic vars 30575 1726867660.83682: variable 'ansible_distribution_major_version' from source: facts 30575 1726867660.83692: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867660.83827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867660.85522: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867660.85564: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867660.85591: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867660.85618: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867660.85638: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867660.85694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867660.85714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867660.85732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867660.85764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867660.85772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867660.85849: variable 'ansible_distribution' from source: facts 30575 1726867660.85853: variable 'ansible_distribution_major_version' from source: facts 30575 1726867660.85867: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30575 1726867660.85940: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867660.86026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867660.86043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867660.86059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867660.86094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867660.86102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867660.86130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867660.86145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867660.86161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867660.86188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867660.86199: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867660.86228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867660.86243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867660.86259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867660.86284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867660.86296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867660.86407: variable 'network_connections' from source: include params 30575 1726867660.86419: variable 'interface' from source: play vars 30575 1726867660.86461: variable 'interface' from source: play vars 30575 1726867660.86512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867660.86619: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867660.86647: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867660.86670: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867660.86692: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867660.86722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867660.86740: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867660.86763: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867660.86781: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867660.86814: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867660.86959: variable 'network_connections' from source: include params 30575 1726867660.86962: variable 'interface' from source: play vars 30575 1726867660.87007: variable 'interface' from source: play vars 30575 1726867660.87025: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867660.87028: when evaluation is False, skipping this task 30575 1726867660.87031: _execute() done 30575 1726867660.87033: dumping result to json 30575 1726867660.87036: done dumping result, returning 30575 1726867660.87043: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001d30] 30575 
1726867660.87047: sending task result for task 0affcac9-a3a5-e081-a588-000000001d30 30575 1726867660.87133: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d30 30575 1726867660.87136: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867660.87219: no more pending results, returning what we have 30575 1726867660.87222: results queue empty 30575 1726867660.87223: checking for any_errors_fatal 30575 1726867660.87229: done checking for any_errors_fatal 30575 1726867660.87230: checking for max_fail_percentage 30575 1726867660.87231: done checking for max_fail_percentage 30575 1726867660.87232: checking to see if all hosts have failed and the running result is not ok 30575 1726867660.87233: done checking to see if all hosts have failed 30575 1726867660.87233: getting the remaining hosts for this loop 30575 1726867660.87235: done getting the remaining hosts for this loop 30575 1726867660.87238: getting the next task for host managed_node3 30575 1726867660.87246: done getting next task for host managed_node3 30575 1726867660.87249: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867660.87254: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867660.87275: getting variables 30575 1726867660.87276: in VariableManager get_vars() 30575 1726867660.87312: Calling all_inventory to load vars for managed_node3 30575 1726867660.87317: Calling groups_inventory to load vars for managed_node3 30575 1726867660.87319: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867660.87326: Calling all_plugins_play to load vars for managed_node3 30575 1726867660.87329: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867660.87331: Calling groups_plugins_play to load vars for managed_node3 30575 1726867660.88182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867660.89051: done with get_vars() 30575 1726867660.89066: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867660.89119: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:27:40 -0400 (0:00:00.063) 0:01:36.268 ****** 30575 1726867660.89142: entering _queue_task() for managed_node3/yum 30575 1726867660.89335: worker is 1 (out of 1 available) 30575 1726867660.89348: exiting _queue_task() for managed_node3/yum 30575 1726867660.89360: done queuing things up, now waiting for results queue to drain 30575 1726867660.89362: waiting for pending results... 30575 1726867660.89548: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867660.89650: in run() - task 0affcac9-a3a5-e081-a588-000000001d31 30575 1726867660.89662: variable 'ansible_search_path' from source: unknown 30575 1726867660.89666: variable 'ansible_search_path' from source: unknown 30575 1726867660.89697: calling self._execute() 30575 1726867660.89772: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867660.89776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867660.89786: variable 'omit' from source: magic vars 30575 1726867660.90068: variable 'ansible_distribution_major_version' from source: facts 30575 1726867660.90079: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867660.90198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867660.91725: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867660.91768: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867660.91798: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867660.91824: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867660.91844: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867660.91906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867660.91941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867660.91960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867660.91990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867660.92002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867660.92067: variable 'ansible_distribution_major_version' from source: facts 30575 1726867660.92080: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30575 1726867660.92084: when evaluation is False, skipping this task 30575 1726867660.92087: _execute() done 30575 1726867660.92091: dumping result to json 30575 1726867660.92094: done dumping result, returning 30575 1726867660.92103: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001d31] 30575 1726867660.92107: sending task result for task 0affcac9-a3a5-e081-a588-000000001d31 30575 1726867660.92190: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d31 30575 1726867660.92193: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30575 1726867660.92248: no more pending results, returning what we have 30575 1726867660.92251: results queue empty 30575 1726867660.92252: checking for any_errors_fatal 30575 1726867660.92259: done checking for any_errors_fatal 30575 1726867660.92260: checking for max_fail_percentage 30575 1726867660.92262: done checking for max_fail_percentage 30575 1726867660.92263: checking to see if all hosts have failed and the running result is not ok 30575 1726867660.92264: done checking to see if all hosts have failed 30575 1726867660.92264: getting the remaining hosts for this loop 30575 1726867660.92266: done getting the remaining hosts for this loop 30575 1726867660.92269: getting the next task for host managed_node3 30575 1726867660.92280: done getting next task for host managed_node3 30575 1726867660.92283: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867660.92287: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867660.92309: getting variables 30575 1726867660.92311: in VariableManager get_vars() 30575 1726867660.92349: Calling all_inventory to load vars for managed_node3 30575 1726867660.92351: Calling groups_inventory to load vars for managed_node3 30575 1726867660.92353: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867660.92361: Calling all_plugins_play to load vars for managed_node3 30575 1726867660.92364: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867660.92366: Calling groups_plugins_play to load vars for managed_node3 30575 1726867660.93158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867660.94142: done with get_vars() 30575 1726867660.94157: done getting variables 30575 1726867660.94196: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:27:40 -0400 (0:00:00.050) 0:01:36.319 ****** 30575 1726867660.94220: entering _queue_task() for managed_node3/fail 30575 1726867660.94427: worker is 1 (out of 1 available) 30575 1726867660.94439: exiting _queue_task() for managed_node3/fail 30575 1726867660.94451: done queuing things up, now waiting for results queue to drain 30575 1726867660.94454: waiting for pending results... 30575 1726867660.94637: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867660.94736: in run() - task 0affcac9-a3a5-e081-a588-000000001d32 30575 1726867660.94748: variable 'ansible_search_path' from source: unknown 30575 1726867660.94752: variable 'ansible_search_path' from source: unknown 30575 1726867660.94780: calling self._execute() 30575 1726867660.94853: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867660.94857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867660.94865: variable 'omit' from source: magic vars 30575 1726867660.95145: variable 'ansible_distribution_major_version' from source: facts 30575 1726867660.95154: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867660.95242: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867660.95373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867660.96859: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867660.96902: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867660.96930: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867660.96955: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867660.96979: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867660.97037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867660.97074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867660.97090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867660.97116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867660.97129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867660.97160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867660.97178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867660.97197: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867660.97225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867660.97235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867660.97263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867660.97280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867660.97302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867660.97327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867660.97338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867660.97452: variable 'network_connections' from source: include params 30575 1726867660.97460: variable 'interface' from source: play vars 30575 1726867660.97507: variable 'interface' from source: play vars 30575 1726867660.97559: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867660.97669: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867660.97697: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867660.97723: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867660.97747: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867660.97779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867660.97794: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867660.97811: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867660.97831: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867660.97871: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867660.98026: variable 'network_connections' from source: include params 30575 1726867660.98030: variable 'interface' from source: play vars 30575 1726867660.98076: variable 'interface' from source: play vars 30575 1726867660.98096: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867660.98099: when evaluation is False, skipping this task 30575 
1726867660.98102: _execute() done 30575 1726867660.98105: dumping result to json 30575 1726867660.98107: done dumping result, returning 30575 1726867660.98114: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001d32] 30575 1726867660.98121: sending task result for task 0affcac9-a3a5-e081-a588-000000001d32 30575 1726867660.98206: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d32 30575 1726867660.98208: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867660.98257: no more pending results, returning what we have 30575 1726867660.98261: results queue empty 30575 1726867660.98261: checking for any_errors_fatal 30575 1726867660.98268: done checking for any_errors_fatal 30575 1726867660.98269: checking for max_fail_percentage 30575 1726867660.98271: done checking for max_fail_percentage 30575 1726867660.98272: checking to see if all hosts have failed and the running result is not ok 30575 1726867660.98273: done checking to see if all hosts have failed 30575 1726867660.98273: getting the remaining hosts for this loop 30575 1726867660.98275: done getting the remaining hosts for this loop 30575 1726867660.98281: getting the next task for host managed_node3 30575 1726867660.98289: done getting next task for host managed_node3 30575 1726867660.98293: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30575 1726867660.98297: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867660.98330: getting variables 30575 1726867660.98332: in VariableManager get_vars() 30575 1726867660.98368: Calling all_inventory to load vars for managed_node3 30575 1726867660.98370: Calling groups_inventory to load vars for managed_node3 30575 1726867660.98372: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867660.98383: Calling all_plugins_play to load vars for managed_node3 30575 1726867660.98385: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867660.98388: Calling groups_plugins_play to load vars for managed_node3 30575 1726867660.99194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867661.00081: done with get_vars() 30575 1726867661.00097: done getting variables 30575 1726867661.00137: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:27:41 -0400 (0:00:00.059) 0:01:36.379 ****** 30575 1726867661.00162: entering _queue_task() for managed_node3/package 30575 1726867661.00378: worker is 1 (out of 1 available) 30575 1726867661.00391: exiting _queue_task() for managed_node3/package 30575 1726867661.00403: done queuing things up, now waiting for results queue to drain 30575 1726867661.00404: waiting for pending results... 30575 1726867661.00590: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30575 1726867661.00694: in run() - task 0affcac9-a3a5-e081-a588-000000001d33 30575 1726867661.00705: variable 'ansible_search_path' from source: unknown 30575 1726867661.00709: variable 'ansible_search_path' from source: unknown 30575 1726867661.00738: calling self._execute() 30575 1726867661.00812: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867661.00818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867661.00825: variable 'omit' from source: magic vars 30575 1726867661.01104: variable 'ansible_distribution_major_version' from source: facts 30575 1726867661.01113: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867661.01243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867661.01440: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867661.01470: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867661.01496: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867661.01551: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867661.01634: variable 'network_packages' from source: role '' defaults 30575 1726867661.01706: variable '__network_provider_setup' from source: role '' defaults 30575 1726867661.01717: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867661.01763: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867661.01771: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867661.01814: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867661.01931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867661.03518: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867661.03555: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867661.03584: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867661.03608: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867661.03629: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867661.03684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867661.03709: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867661.03729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.03754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867661.03764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867661.03798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867661.03820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867661.03835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.03859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867661.03869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 
1726867661.04007: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867661.04079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867661.04096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867661.04112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.04141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867661.04152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867661.04210: variable 'ansible_python' from source: facts 30575 1726867661.04224: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867661.04281: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867661.04334: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867661.04419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867661.04433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867661.04450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.04481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867661.04493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867661.04524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867661.04543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867661.04559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.04591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867661.04602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867661.04694: variable 'network_connections' from source: include params 
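The log above shows the role reading `network_connections` and `interface` from play vars and then evaluating `__network_wireless_connections_defined or __network_team_connections_defined` to False, skipping the NetworkManager-restart consent task. The role's actual Jinja2 default expressions are not reproduced in this log; as a rough illustration only, the detection amounts to scanning the connection list for wireless or team types (hypothetical helper, plain Python rather than Jinja2):

```python
# Hypothetical illustration of the wireless/team detection that gates the
# "Ask user's consent to restart NetworkManager" task above; the real role
# expresses this as Jinja2 defaults, not Python.

def connections_of_type(network_connections, conn_type):
    """Return the connection dicts whose 'type' matches conn_type."""
    return [c for c in network_connections if c.get("type") == conn_type]

# Illustrative connection list; the real one comes from include params.
network_connections = [
    {"name": "ethtest0", "type": "ethernet", "interface_name": "ethtest0"},
]

wireless_defined = bool(connections_of_type(network_connections, "wireless"))
team_defined = bool(connections_of_type(network_connections, "team"))

# Mirrors the conditional in the log: with neither wireless nor team
# connections defined, the task is skipped.
should_run = wireless_defined or team_defined
```

With only an ethernet-style connection defined, as in this test run, the condition evaluates to False, which matches the `skipping: [managed_node3]` result recorded above.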
30575 1726867661.04698: variable 'interface' from source: play vars 30575 1726867661.04766: variable 'interface' from source: play vars 30575 1726867661.04820: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867661.04837: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867661.04857: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.04880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867661.04982: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867661.05098: variable 'network_connections' from source: include params 30575 1726867661.05101: variable 'interface' from source: play vars 30575 1726867661.05173: variable 'interface' from source: play vars 30575 1726867661.05197: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867661.05255: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867661.05462: variable 'network_connections' from source: include params 30575 1726867661.05465: variable 'interface' from source: play vars 30575 1726867661.05512: variable 'interface' from source: play vars 30575 1726867661.05531: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867661.05588: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867661.05781: variable 'network_connections' 
from source: include params 30575 1726867661.05785: variable 'interface' from source: play vars 30575 1726867661.05831: variable 'interface' from source: play vars 30575 1726867661.05866: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867661.05911: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867661.05919: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867661.05960: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867661.06100: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867661.06386: variable 'network_connections' from source: include params 30575 1726867661.06390: variable 'interface' from source: play vars 30575 1726867661.06436: variable 'interface' from source: play vars 30575 1726867661.06443: variable 'ansible_distribution' from source: facts 30575 1726867661.06445: variable '__network_rh_distros' from source: role '' defaults 30575 1726867661.06452: variable 'ansible_distribution_major_version' from source: facts 30575 1726867661.06463: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867661.06570: variable 'ansible_distribution' from source: facts 30575 1726867661.06574: variable '__network_rh_distros' from source: role '' defaults 30575 1726867661.06579: variable 'ansible_distribution_major_version' from source: facts 30575 1726867661.06590: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867661.06698: variable 'ansible_distribution' from source: facts 30575 1726867661.06701: variable '__network_rh_distros' from source: role '' defaults 30575 1726867661.06705: variable 'ansible_distribution_major_version' from source: facts 30575 1726867661.06733: variable 'network_provider' from source: set_fact 30575 
1726867661.06743: variable 'ansible_facts' from source: unknown 30575 1726867661.07094: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30575 1726867661.07097: when evaluation is False, skipping this task 30575 1726867661.07100: _execute() done 30575 1726867661.07102: dumping result to json 30575 1726867661.07104: done dumping result, returning 30575 1726867661.07113: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-e081-a588-000000001d33] 30575 1726867661.07120: sending task result for task 0affcac9-a3a5-e081-a588-000000001d33 30575 1726867661.07213: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d33 30575 1726867661.07216: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30575 1726867661.07262: no more pending results, returning what we have 30575 1726867661.07265: results queue empty 30575 1726867661.07266: checking for any_errors_fatal 30575 1726867661.07273: done checking for any_errors_fatal 30575 1726867661.07273: checking for max_fail_percentage 30575 1726867661.07275: done checking for max_fail_percentage 30575 1726867661.07276: checking to see if all hosts have failed and the running result is not ok 30575 1726867661.07278: done checking to see if all hosts have failed 30575 1726867661.07279: getting the remaining hosts for this loop 30575 1726867661.07281: done getting the remaining hosts for this loop 30575 1726867661.07285: getting the next task for host managed_node3 30575 1726867661.07295: done getting next task for host managed_node3 30575 1726867661.07299: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867661.07304: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867661.07328: getting variables 30575 1726867661.07330: in VariableManager get_vars() 30575 1726867661.07372: Calling all_inventory to load vars for managed_node3 30575 1726867661.07375: Calling groups_inventory to load vars for managed_node3 30575 1726867661.07386: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867661.07395: Calling all_plugins_play to load vars for managed_node3 30575 1726867661.07398: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867661.07400: Calling groups_plugins_play to load vars for managed_node3 30575 1726867661.08364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867661.09235: done with get_vars() 30575 1726867661.09250: done getting variables 30575 1726867661.09293: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:27:41 -0400 (0:00:00.091) 0:01:36.470 ****** 30575 1726867661.09319: entering _queue_task() for managed_node3/package 30575 1726867661.09531: worker is 1 (out of 1 available) 30575 1726867661.09544: exiting _queue_task() for managed_node3/package 30575 1726867661.09555: done queuing things up, now waiting for results queue to drain 30575 1726867661.09557: waiting for pending results... 
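Earlier in this run the "Install packages" task was skipped because `not network_packages is subset(ansible_facts.packages.keys())` evaluated to False, i.e. every package the role wants was already present in the gathered package facts. Ansible's `subset` test corresponds to set containment in Python; a minimal sketch of the same check (package names and versions here are hypothetical):

```python
# Plain-Python equivalent of the Jinja2 `subset` test used by the
# "Install packages" task above. Package names/versions are illustrative.

network_packages = ["NetworkManager"]

# ansible_facts.packages maps package name -> list of installed-version
# dicts, as gathered by the package_facts module.
ansible_facts_packages = {
    "NetworkManager": [{"version": "1.46.0"}],
    "openssh-server": [{"version": "9.6"}],
}

# `network_packages is subset(ansible_facts.packages.keys())` in Jinja2
# is set containment in Python:
already_installed = set(network_packages) <= set(ansible_facts_packages.keys())

# The task only needs to run when some required package is missing:
should_install = not already_installed
```

Here `should_install` is False, matching the `"false_condition": "not network_packages is subset(ansible_facts.packages.keys())"` skip result in the log.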
30575 1726867661.09738: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867661.09834: in run() - task 0affcac9-a3a5-e081-a588-000000001d34 30575 1726867661.09845: variable 'ansible_search_path' from source: unknown 30575 1726867661.09848: variable 'ansible_search_path' from source: unknown 30575 1726867661.09878: calling self._execute() 30575 1726867661.09957: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867661.09960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867661.09969: variable 'omit' from source: magic vars 30575 1726867661.10250: variable 'ansible_distribution_major_version' from source: facts 30575 1726867661.10259: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867661.10343: variable 'network_state' from source: role '' defaults 30575 1726867661.10352: Evaluated conditional (network_state != {}): False 30575 1726867661.10356: when evaluation is False, skipping this task 30575 1726867661.10359: _execute() done 30575 1726867661.10361: dumping result to json 30575 1726867661.10363: done dumping result, returning 30575 1726867661.10372: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000001d34] 30575 1726867661.10378: sending task result for task 0affcac9-a3a5-e081-a588-000000001d34 30575 1726867661.10467: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d34 30575 1726867661.10469: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867661.10518: no more pending results, returning what we have 30575 1726867661.10522: results queue empty 30575 1726867661.10522: checking 
for any_errors_fatal 30575 1726867661.10529: done checking for any_errors_fatal 30575 1726867661.10529: checking for max_fail_percentage 30575 1726867661.10531: done checking for max_fail_percentage 30575 1726867661.10532: checking to see if all hosts have failed and the running result is not ok 30575 1726867661.10533: done checking to see if all hosts have failed 30575 1726867661.10533: getting the remaining hosts for this loop 30575 1726867661.10534: done getting the remaining hosts for this loop 30575 1726867661.10538: getting the next task for host managed_node3 30575 1726867661.10544: done getting next task for host managed_node3 30575 1726867661.10548: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867661.10552: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867661.10572: getting variables 30575 1726867661.10574: in VariableManager get_vars() 30575 1726867661.10617: Calling all_inventory to load vars for managed_node3 30575 1726867661.10620: Calling groups_inventory to load vars for managed_node3 30575 1726867661.10622: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867661.10629: Calling all_plugins_play to load vars for managed_node3 30575 1726867661.10631: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867661.10634: Calling groups_plugins_play to load vars for managed_node3 30575 1726867661.11369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867661.12247: done with get_vars() 30575 1726867661.12262: done getting variables 30575 1726867661.12301: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:27:41 -0400 (0:00:00.030) 0:01:36.500 ****** 30575 1726867661.12327: entering _queue_task() for managed_node3/package 30575 1726867661.12514: worker is 1 (out of 1 available) 30575 1726867661.12527: exiting _queue_task() for managed_node3/package 30575 1726867661.12539: done queuing things up, now waiting for results queue to drain 30575 1726867661.12541: waiting for pending results... 
30575 1726867661.12721: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867661.12826: in run() - task 0affcac9-a3a5-e081-a588-000000001d35 30575 1726867661.12837: variable 'ansible_search_path' from source: unknown 30575 1726867661.12841: variable 'ansible_search_path' from source: unknown 30575 1726867661.12874: calling self._execute() 30575 1726867661.12947: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867661.12951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867661.12962: variable 'omit' from source: magic vars 30575 1726867661.13233: variable 'ansible_distribution_major_version' from source: facts 30575 1726867661.13242: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867661.13328: variable 'network_state' from source: role '' defaults 30575 1726867661.13337: Evaluated conditional (network_state != {}): False 30575 1726867661.13340: when evaluation is False, skipping this task 30575 1726867661.13342: _execute() done 30575 1726867661.13345: dumping result to json 30575 1726867661.13350: done dumping result, returning 30575 1726867661.13357: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000001d35] 30575 1726867661.13362: sending task result for task 0affcac9-a3a5-e081-a588-000000001d35 30575 1726867661.13450: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d35 30575 1726867661.13453: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867661.13501: no more pending results, returning what we have 30575 1726867661.13504: results queue empty 30575 1726867661.13505: checking for 
any_errors_fatal 30575 1726867661.13510: done checking for any_errors_fatal 30575 1726867661.13511: checking for max_fail_percentage 30575 1726867661.13512: done checking for max_fail_percentage 30575 1726867661.13513: checking to see if all hosts have failed and the running result is not ok 30575 1726867661.13514: done checking to see if all hosts have failed 30575 1726867661.13515: getting the remaining hosts for this loop 30575 1726867661.13516: done getting the remaining hosts for this loop 30575 1726867661.13519: getting the next task for host managed_node3 30575 1726867661.13526: done getting next task for host managed_node3 30575 1726867661.13529: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867661.13533: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867661.13552: getting variables 30575 1726867661.13553: in VariableManager get_vars() 30575 1726867661.13598: Calling all_inventory to load vars for managed_node3 30575 1726867661.13600: Calling groups_inventory to load vars for managed_node3 30575 1726867661.13603: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867661.13610: Calling all_plugins_play to load vars for managed_node3 30575 1726867661.13612: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867661.13614: Calling groups_plugins_play to load vars for managed_node3 30575 1726867661.14487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867661.15361: done with get_vars() 30575 1726867661.15375: done getting variables 30575 1726867661.15414: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:27:41 -0400 (0:00:00.031) 0:01:36.531 ****** 30575 1726867661.15443: entering _queue_task() for managed_node3/service 30575 1726867661.15635: worker is 1 (out of 1 available) 30575 1726867661.15647: exiting _queue_task() for managed_node3/service 30575 1726867661.15661: done queuing things up, now waiting for results queue to drain 30575 1726867661.15663: waiting for pending results... 
30575 1726867661.15840: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867661.15940: in run() - task 0affcac9-a3a5-e081-a588-000000001d36 30575 1726867661.15951: variable 'ansible_search_path' from source: unknown 30575 1726867661.15954: variable 'ansible_search_path' from source: unknown 30575 1726867661.15985: calling self._execute() 30575 1726867661.16062: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867661.16066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867661.16076: variable 'omit' from source: magic vars 30575 1726867661.16351: variable 'ansible_distribution_major_version' from source: facts 30575 1726867661.16360: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867661.16446: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867661.16583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867661.18082: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867661.18118: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867661.18143: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867661.18169: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867661.18198: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867661.18254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30575 1726867661.18292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867661.18307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.18334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867661.18345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867661.18376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867661.18401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867661.18417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.18441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867661.18451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867661.18481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867661.18498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867661.18521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.18543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867661.18554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867661.18667: variable 'network_connections' from source: include params 30575 1726867661.18676: variable 'interface' from source: play vars 30575 1726867661.18723: variable 'interface' from source: play vars 30575 1726867661.18772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867661.18883: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867661.18912: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867661.18936: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867661.18960: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867661.18994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867661.19009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867661.19028: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.19046: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867661.19088: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867661.19237: variable 'network_connections' from source: include params 30575 1726867661.19240: variable 'interface' from source: play vars 30575 1726867661.19286: variable 'interface' from source: play vars 30575 1726867661.19305: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867661.19309: when evaluation is False, skipping this task 30575 1726867661.19311: _execute() done 30575 1726867661.19314: dumping result to json 30575 1726867661.19319: done dumping result, returning 30575 1726867661.19324: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000001d36] 30575 1726867661.19329: sending task result for task 0affcac9-a3a5-e081-a588-000000001d36 30575 1726867661.19413: done sending task result for task 
0affcac9-a3a5-e081-a588-000000001d36 30575 1726867661.19425: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867661.19469: no more pending results, returning what we have 30575 1726867661.19472: results queue empty 30575 1726867661.19473: checking for any_errors_fatal 30575 1726867661.19480: done checking for any_errors_fatal 30575 1726867661.19481: checking for max_fail_percentage 30575 1726867661.19482: done checking for max_fail_percentage 30575 1726867661.19483: checking to see if all hosts have failed and the running result is not ok 30575 1726867661.19485: done checking to see if all hosts have failed 30575 1726867661.19486: getting the remaining hosts for this loop 30575 1726867661.19487: done getting the remaining hosts for this loop 30575 1726867661.19491: getting the next task for host managed_node3 30575 1726867661.19500: done getting next task for host managed_node3 30575 1726867661.19504: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867661.19509: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867661.19535: getting variables 30575 1726867661.19536: in VariableManager get_vars() 30575 1726867661.19574: Calling all_inventory to load vars for managed_node3 30575 1726867661.19576: Calling groups_inventory to load vars for managed_node3 30575 1726867661.19580: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867661.19589: Calling all_plugins_play to load vars for managed_node3 30575 1726867661.19591: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867661.19594: Calling groups_plugins_play to load vars for managed_node3 30575 1726867661.20415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867661.21411: done with get_vars() 30575 1726867661.21430: done getting variables 30575 1726867661.21469: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:27:41 -0400 (0:00:00.060) 0:01:36.592 ****** 30575 1726867661.21493: entering _queue_task() for managed_node3/service 30575 1726867661.21706: worker is 1 (out of 1 available) 30575 1726867661.21719: exiting _queue_task() for managed_node3/service 30575 1726867661.21731: done 
queuing things up, now waiting for results queue to drain 30575 1726867661.21732: waiting for pending results... 30575 1726867661.21922: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867661.22019: in run() - task 0affcac9-a3a5-e081-a588-000000001d37 30575 1726867661.22032: variable 'ansible_search_path' from source: unknown 30575 1726867661.22036: variable 'ansible_search_path' from source: unknown 30575 1726867661.22064: calling self._execute() 30575 1726867661.22143: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867661.22147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867661.22154: variable 'omit' from source: magic vars 30575 1726867661.22439: variable 'ansible_distribution_major_version' from source: facts 30575 1726867661.22448: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867661.22560: variable 'network_provider' from source: set_fact 30575 1726867661.22564: variable 'network_state' from source: role '' defaults 30575 1726867661.22575: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30575 1726867661.22580: variable 'omit' from source: magic vars 30575 1726867661.22635: variable 'omit' from source: magic vars 30575 1726867661.22654: variable 'network_service_name' from source: role '' defaults 30575 1726867661.22701: variable 'network_service_name' from source: role '' defaults 30575 1726867661.22776: variable '__network_provider_setup' from source: role '' defaults 30575 1726867661.22781: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867661.22827: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867661.22843: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867661.22882: variable '__network_packages_default_nm' from source: role '' 
defaults 30575 1726867661.23030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867661.24486: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867661.24529: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867661.24555: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867661.24594: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867661.24614: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867661.24670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867661.24695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867661.24713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.24740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867661.24751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867661.24781: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867661.24801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867661.24819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.24845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867661.24855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867661.25005: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867661.25078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867661.25095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867661.25111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.25144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867661.25155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867661.25213: variable 'ansible_python' from source: facts 30575 1726867661.25236: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867661.25285: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867661.25346: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867661.25422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867661.25441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867661.25461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.25487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867661.25498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867661.25532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867661.25551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867661.25572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.25598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867661.25608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867661.25703: variable 'network_connections' from source: include params 30575 1726867661.25708: variable 'interface' from source: play vars 30575 1726867661.25760: variable 'interface' from source: play vars 30575 1726867661.25835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867661.25953: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867661.25995: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867661.26025: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867661.26054: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867661.26113: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867661.26134: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867661.26156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.26180: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867661.26223: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867661.26408: variable 'network_connections' from source: include params 30575 1726867661.26413: variable 'interface' from source: play vars 30575 1726867661.26469: variable 'interface' from source: play vars 30575 1726867661.26493: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867661.26547: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867661.26732: variable 'network_connections' from source: include params 30575 1726867661.26735: variable 'interface' from source: play vars 30575 1726867661.26789: variable 'interface' from source: play vars 30575 1726867661.26805: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867661.26857: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867661.27043: variable 'network_connections' from source: include params 30575 1726867661.27046: variable 'interface' from source: play vars 30575 1726867661.27098: variable 'interface' from source: play vars 30575 1726867661.27133: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30575 1726867661.27173: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867661.27180: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867661.27226: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867661.27357: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867661.27655: variable 'network_connections' from source: include params 30575 1726867661.27658: variable 'interface' from source: play vars 30575 1726867661.27701: variable 'interface' from source: play vars 30575 1726867661.27707: variable 'ansible_distribution' from source: facts 30575 1726867661.27710: variable '__network_rh_distros' from source: role '' defaults 30575 1726867661.27719: variable 'ansible_distribution_major_version' from source: facts 30575 1726867661.27728: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867661.27857: variable 'ansible_distribution' from source: facts 30575 1726867661.27987: variable '__network_rh_distros' from source: role '' defaults 30575 1726867661.27991: variable 'ansible_distribution_major_version' from source: facts 30575 1726867661.27994: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867661.28051: variable 'ansible_distribution' from source: facts 30575 1726867661.28062: variable '__network_rh_distros' from source: role '' defaults 30575 1726867661.28072: variable 'ansible_distribution_major_version' from source: facts 30575 1726867661.28128: variable 'network_provider' from source: set_fact 30575 1726867661.28154: variable 'omit' from source: magic vars 30575 1726867661.28186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867661.28217: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867661.28241: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867661.28263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867661.28282: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867661.28314: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867661.28324: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867661.28331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867661.28426: Set connection var ansible_pipelining to False 30575 1726867661.28434: Set connection var ansible_shell_type to sh 30575 1726867661.28446: Set connection var ansible_shell_executable to /bin/sh 30575 1726867661.28455: Set connection var ansible_timeout to 10 30575 1726867661.28464: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867661.28474: Set connection var ansible_connection to ssh 30575 1726867661.28504: variable 'ansible_shell_executable' from source: unknown 30575 1726867661.28511: variable 'ansible_connection' from source: unknown 30575 1726867661.28519: variable 'ansible_module_compression' from source: unknown 30575 1726867661.28682: variable 'ansible_shell_type' from source: unknown 30575 1726867661.28685: variable 'ansible_shell_executable' from source: unknown 30575 1726867661.28687: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867661.28689: variable 'ansible_pipelining' from source: unknown 30575 1726867661.28691: variable 'ansible_timeout' from source: unknown 30575 1726867661.28693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867661.28696: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867661.28702: variable 'omit' from source: magic vars 30575 1726867661.28704: starting attempt loop 30575 1726867661.28706: running the handler 30575 1726867661.28750: variable 'ansible_facts' from source: unknown 30575 1726867661.29297: _low_level_execute_command(): starting 30575 1726867661.29304: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867661.29779: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867661.29783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867661.29786: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867661.29788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867661.29838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867661.29841: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867661.29843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867661.29914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867661.31604: stdout chunk (state=3): >>>/root <<< 30575 1726867661.31742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867661.31745: stdout chunk (state=3): >>><<< 30575 1726867661.31748: stderr chunk (state=3): >>><<< 30575 1726867661.31763: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867661.31783: _low_level_execute_command(): starting 30575 1726867661.31792: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir 
"` echo /root/.ansible/tmp/ansible-tmp-1726867661.3176963-35148-45528866707619 `" && echo ansible-tmp-1726867661.3176963-35148-45528866707619="` echo /root/.ansible/tmp/ansible-tmp-1726867661.3176963-35148-45528866707619 `" ) && sleep 0' 30575 1726867661.32367: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867661.32385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867661.32401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867661.32425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867661.32444: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867661.32456: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867661.32472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867661.32499: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867661.32544: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867661.32603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867661.32624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867661.32649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867661.32720: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867661.34619: stdout chunk (state=3): >>>ansible-tmp-1726867661.3176963-35148-45528866707619=/root/.ansible/tmp/ansible-tmp-1726867661.3176963-35148-45528866707619 <<< 30575 1726867661.34745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867661.34756: stdout chunk (state=3): >>><<< 30575 1726867661.34761: stderr chunk (state=3): >>><<< 30575 1726867661.34772: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867661.3176963-35148-45528866707619=/root/.ansible/tmp/ansible-tmp-1726867661.3176963-35148-45528866707619 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867661.34798: variable 'ansible_module_compression' from source: unknown 30575 1726867661.34838: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30575 1726867661.34892: variable 'ansible_facts' from source: unknown 30575 1726867661.35026: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867661.3176963-35148-45528866707619/AnsiballZ_systemd.py 30575 1726867661.35135: Sending initial data 30575 1726867661.35139: Sent initial data (155 bytes) 30575 1726867661.35793: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867661.35857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867661.35871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867661.35892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867661.35966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867661.37501: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867661.37505: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867661.37547: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867661.37590: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpvltwblo4 /root/.ansible/tmp/ansible-tmp-1726867661.3176963-35148-45528866707619/AnsiballZ_systemd.py <<< 30575 1726867661.37593: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867661.3176963-35148-45528866707619/AnsiballZ_systemd.py" <<< 30575 1726867661.37633: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpvltwblo4" to remote "/root/.ansible/tmp/ansible-tmp-1726867661.3176963-35148-45528866707619/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867661.3176963-35148-45528866707619/AnsiballZ_systemd.py" <<< 30575 1726867661.39033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867661.39036: stderr chunk (state=3): >>><<< 30575 1726867661.39039: stdout chunk (state=3): >>><<< 30575 1726867661.39041: done transferring module to remote 30575 1726867661.39042: _low_level_execute_command(): starting 30575 1726867661.39045: _low_level_execute_command(): executing: 
/bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867661.3176963-35148-45528866707619/ /root/.ansible/tmp/ansible-tmp-1726867661.3176963-35148-45528866707619/AnsiballZ_systemd.py && sleep 0' 30575 1726867661.39445: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867661.39449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867661.39451: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867661.39468: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867661.39470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867661.39528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867661.39532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867661.39580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867661.41544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867661.41548: stdout chunk (state=3): >>><<< 30575 1726867661.41550: stderr chunk (state=3): >>><<< 30575 1726867661.41553: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867661.41555: _low_level_execute_command(): starting 30575 1726867661.41557: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867661.3176963-35148-45528866707619/AnsiballZ_systemd.py && sleep 0' 30575 1726867661.42093: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867661.42141: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867661.42187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867661.71487: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", 
"ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10571776", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3316097024", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1954189000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", 
"IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30575 1726867661.73279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867661.73284: stderr chunk (state=3): >>><<< 30575 1726867661.73286: stdout chunk (state=3): >>><<< 30575 1726867661.73313: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10571776", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3316097024", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1954189000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867661.73768: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867661.3176963-35148-45528866707619/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867661.73806: _low_level_execute_command(): starting 30575 1726867661.73863: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867661.3176963-35148-45528866707619/ > /dev/null 2>&1 && sleep 0' 30575 1726867661.75075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867661.75095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867661.75146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867661.75221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867661.75253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867661.75282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867661.75348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867661.77541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867661.77545: stdout chunk (state=3): >>><<< 30575 1726867661.77548: stderr chunk (state=3): >>><<< 30575 1726867661.77550: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 30575 1726867661.77552: handler run complete 30575 1726867661.77554: attempt loop complete, returning result 30575 1726867661.77556: _execute() done 30575 1726867661.77558: dumping result to json 30575 1726867661.77659: done dumping result, returning 30575 1726867661.77787: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-e081-a588-000000001d37] 30575 1726867661.77790: sending task result for task 0affcac9-a3a5-e081-a588-000000001d37 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867661.78439: no more pending results, returning what we have 30575 1726867661.78443: results queue empty 30575 1726867661.78444: checking for any_errors_fatal 30575 1726867661.78452: done checking for any_errors_fatal 30575 1726867661.78453: checking for max_fail_percentage 30575 1726867661.78454: done checking for max_fail_percentage 30575 1726867661.78455: checking to see if all hosts have failed and the running result is not ok 30575 1726867661.78456: done checking to see if all hosts have failed 30575 1726867661.78457: getting the remaining hosts for this loop 30575 1726867661.78458: done getting the remaining hosts for this loop 30575 1726867661.78463: getting the next task for host managed_node3 30575 1726867661.78472: done getting next task for host managed_node3 30575 1726867661.78476: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867661.78524: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867661.78542: getting variables 30575 1726867661.78544: in VariableManager get_vars() 30575 1726867661.78698: Calling all_inventory to load vars for managed_node3 30575 1726867661.78701: Calling groups_inventory to load vars for managed_node3 30575 1726867661.78705: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867661.78716: Calling all_plugins_play to load vars for managed_node3 30575 1726867661.78719: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867661.78723: Calling groups_plugins_play to load vars for managed_node3 30575 1726867661.79363: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d37 30575 1726867661.79366: WORKER PROCESS EXITING 30575 1726867661.81222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867661.84865: done with get_vars() 30575 1726867661.84963: done getting variables 30575 1726867661.85105: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:27:41 -0400 (0:00:00.636) 0:01:37.229 ****** 30575 1726867661.85193: entering _queue_task() for managed_node3/service 30575 1726867661.86159: worker is 1 (out of 1 available) 30575 1726867661.86172: exiting _queue_task() for managed_node3/service 30575 1726867661.86226: done queuing things up, now waiting for results queue to drain 30575 1726867661.86228: waiting for pending results... 30575 1726867661.86600: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867661.86935: in run() - task 0affcac9-a3a5-e081-a588-000000001d38 30575 1726867661.86956: variable 'ansible_search_path' from source: unknown 30575 1726867661.86963: variable 'ansible_search_path' from source: unknown 30575 1726867661.87007: calling self._execute() 30575 1726867661.87289: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867661.87300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867661.87490: variable 'omit' from source: magic vars 30575 1726867661.88247: variable 'ansible_distribution_major_version' from source: facts 30575 1726867661.88263: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867661.88375: variable 'network_provider' from source: set_fact 30575 1726867661.88782: Evaluated conditional (network_provider == "nm"): True 30575 1726867661.88785: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 
1726867661.88787: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867661.89117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867661.94074: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867661.94256: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867661.94327: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867661.94421: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867661.94699: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867661.95589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867661.95632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867661.95661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.95775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867661.95796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30575 1726867661.95893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867661.95979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867661.96007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.96049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867661.96090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867661.96139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867661.96168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867661.96198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.96246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867661.96317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867661.96644: variable 'network_connections' from source: include params 30575 1726867661.96647: variable 'interface' from source: play vars 30575 1726867661.96800: variable 'interface' from source: play vars 30575 1726867661.96919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867661.97117: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867661.97159: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867661.97215: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867661.97248: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867661.97308: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867661.97337: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867661.97367: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867661.97414: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867661.97465: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867661.97715: variable 'network_connections' from source: include params 30575 1726867661.97733: variable 'interface' from source: play vars 30575 1726867661.97796: variable 'interface' from source: play vars 30575 1726867661.97828: Evaluated conditional (__network_wpa_supplicant_required): False 30575 1726867661.97844: when evaluation is False, skipping this task 30575 1726867661.97851: _execute() done 30575 1726867661.97857: dumping result to json 30575 1726867661.97863: done dumping result, returning 30575 1726867661.97874: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-e081-a588-000000001d38] 30575 1726867661.97949: sending task result for task 0affcac9-a3a5-e081-a588-000000001d38 30575 1726867661.98018: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d38 30575 1726867661.98021: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30575 1726867661.98102: no more pending results, returning what we have 30575 1726867661.98106: results queue empty 30575 1726867661.98107: checking for any_errors_fatal 30575 1726867661.98124: done checking for any_errors_fatal 30575 1726867661.98125: checking for max_fail_percentage 30575 1726867661.98127: done checking for max_fail_percentage 30575 1726867661.98128: checking to see if all hosts have failed and the running result is not ok 30575 1726867661.98129: done checking to see if all hosts have failed 30575 1726867661.98130: getting the remaining hosts for this loop 30575 1726867661.98131: done getting the remaining hosts for this loop 30575 1726867661.98136: getting the next task 
for host managed_node3
30575 1726867661.98146: done getting next task for host managed_node3
30575 1726867661.98150: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
30575 1726867661.98155: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867661.98191: getting variables
30575 1726867661.98193: in VariableManager get_vars()
30575 1726867661.98238: Calling all_inventory to load vars for managed_node3
30575 1726867661.98241: Calling groups_inventory to load vars for managed_node3
30575 1726867661.98243: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867661.98254: Calling all_plugins_play to load vars for managed_node3
30575 1726867661.98258: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867661.98261: Calling groups_plugins_play to load vars for managed_node3
30575 1726867662.00283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867662.01854: done with get_vars()
30575 1726867662.01883: done getting variables
30575 1726867662.01942: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 17:27:42 -0400 (0:00:00.167) 0:01:37.397 ******
30575 1726867662.01986: entering _queue_task() for managed_node3/service
30575 1726867662.02402: worker is 1 (out of 1 available)
30575 1726867662.02417: exiting _queue_task() for managed_node3/service
30575 1726867662.02427: done queuing things up, now waiting for results queue to drain
30575 1726867662.02429: waiting for pending results...
30575 1726867662.02698: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service
30575 1726867662.02883: in run() - task 0affcac9-a3a5-e081-a588-000000001d39
30575 1726867662.02887: variable 'ansible_search_path' from source: unknown
30575 1726867662.02889: variable 'ansible_search_path' from source: unknown
30575 1726867662.02958: calling self._execute()
30575 1726867662.03048: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867662.03065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867662.03081: variable 'omit' from source: magic vars
30575 1726867662.03500: variable 'ansible_distribution_major_version' from source: facts
30575 1726867662.03504: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867662.03615: variable 'network_provider' from source: set_fact
30575 1726867662.03626: Evaluated conditional (network_provider == "initscripts"): False
30575 1726867662.03633: when evaluation is False, skipping this task
30575 1726867662.03639: _execute() done
30575 1726867662.03661: dumping result to json
30575 1726867662.03664: done dumping result, returning
30575 1726867662.03666: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-e081-a588-000000001d39]
30575 1726867662.03675: sending task result for task 0affcac9-a3a5-e081-a588-000000001d39
skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
30575 1726867662.03934: no more pending results, returning what we have
30575 1726867662.03939: results queue empty
30575 1726867662.03939: checking for any_errors_fatal
30575 1726867662.03950: done checking for any_errors_fatal
30575 1726867662.03951: checking for max_fail_percentage
30575 1726867662.03953: done checking for max_fail_percentage
30575 1726867662.03954: checking to see if all hosts have failed and the running result is not ok
30575 1726867662.03955: done checking to see if all hosts have failed
30575 1726867662.03955: getting the remaining hosts for this loop
30575 1726867662.03957: done getting the remaining hosts for this loop
30575 1726867662.03961: getting the next task for host managed_node3
30575 1726867662.03970: done getting next task for host managed_node3
30575 1726867662.03975: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
30575 1726867662.03983: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867662.04015: getting variables
30575 1726867662.04017: in VariableManager get_vars()
30575 1726867662.04064: Calling all_inventory to load vars for managed_node3
30575 1726867662.04067: Calling groups_inventory to load vars for managed_node3
30575 1726867662.04070: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867662.04284: Calling all_plugins_play to load vars for managed_node3
30575 1726867662.04288: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867662.04293: Calling groups_plugins_play to load vars for managed_node3
30575 1726867662.04919: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d39
30575 1726867662.04922: WORKER PROCESS EXITING
30575 1726867662.05653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867662.07268: done with get_vars()
30575 1726867662.07305: done getting variables
30575 1726867662.07369: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 17:27:42 -0400 (0:00:00.054) 0:01:37.451 ******
30575 1726867662.07413: entering _queue_task() for managed_node3/copy
30575 1726867662.07887: worker is 1 (out of 1 available)
30575 1726867662.07899: exiting _queue_task() for managed_node3/copy
30575 1726867662.07911: done queuing things up, now waiting for results queue to drain
30575 1726867662.07912: waiting for pending results...
30575 1726867662.08114: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
30575 1726867662.08283: in run() - task 0affcac9-a3a5-e081-a588-000000001d3a
30575 1726867662.08308: variable 'ansible_search_path' from source: unknown
30575 1726867662.08316: variable 'ansible_search_path' from source: unknown
30575 1726867662.08363: calling self._execute()
30575 1726867662.08473: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867662.08487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867662.08501: variable 'omit' from source: magic vars
30575 1726867662.08914: variable 'ansible_distribution_major_version' from source: facts
30575 1726867662.08936: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867662.09063: variable 'network_provider' from source: set_fact
30575 1726867662.09074: Evaluated conditional (network_provider == "initscripts"): False
30575 1726867662.09082: when evaluation is False, skipping this task
30575 1726867662.09088: _execute() done
30575 1726867662.09095: dumping result to json
30575 1726867662.09101: done dumping result, returning
30575 1726867662.09111: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-e081-a588-000000001d3a]
30575 1726867662.09124: sending task result for task 0affcac9-a3a5-e081-a588-000000001d3a
skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" }
30575 1726867662.09384: no more pending results, returning what we have
30575 1726867662.09389: results queue empty
30575 1726867662.09390: checking for any_errors_fatal
30575 1726867662.09397: done checking for any_errors_fatal
30575 1726867662.09398: checking for max_fail_percentage
30575 1726867662.09400: done checking for max_fail_percentage
30575 1726867662.09401: checking to see if all hosts have failed and the running result is not ok
30575 1726867662.09402: done checking to see if all hosts have failed
30575 1726867662.09403: getting the remaining hosts for this loop
30575 1726867662.09404: done getting the remaining hosts for this loop
30575 1726867662.09408: getting the next task for host managed_node3
30575 1726867662.09418: done getting next task for host managed_node3
30575 1726867662.09422: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
30575 1726867662.09427: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30575 1726867662.04015: getting variables
30575 1726867662.04017: in VariableManager get_vars()
30575 1726867662.09501: Calling all_inventory to load vars for managed_node3
30575 1726867662.09504: Calling groups_inventory to load vars for managed_node3
30575 1726867662.09507: Calling all_plugins_inventory to load vars for managed_node3
30575 1726867662.09519: Calling all_plugins_play to load vars for managed_node3
30575 1726867662.09523: Calling groups_plugins_inventory to load vars for managed_node3
30575 1726867662.09526: Calling groups_plugins_play to load vars for managed_node3
30575 1726867662.10092: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d3a
30575 1726867662.10095: WORKER PROCESS EXITING
30575 1726867662.11166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30575 1726867662.12675: done with get_vars()
30575 1726867662.12697: done getting variables
TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 17:27:42 -0400 (0:00:00.053) 0:01:37.505 ******
30575 1726867662.12787: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections
30575 1726867662.13202: worker is 1 (out of 1 available)
30575 1726867662.13213: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections
30575 1726867662.13222: done queuing things up, now waiting for results queue to drain
30575 1726867662.13224: waiting for pending results...
30575 1726867662.13426: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
30575 1726867662.13589: in run() - task 0affcac9-a3a5-e081-a588-000000001d3b
30575 1726867662.13611: variable 'ansible_search_path' from source: unknown
30575 1726867662.13622: variable 'ansible_search_path' from source: unknown
30575 1726867662.13665: calling self._execute()
30575 1726867662.13773: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867662.13792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867662.13806: variable 'omit' from source: magic vars
30575 1726867662.14218: variable 'ansible_distribution_major_version' from source: facts
30575 1726867662.14222: Evaluated conditional (ansible_distribution_major_version != '6'): True
30575 1726867662.14231: variable 'omit' from source: magic vars
30575 1726867662.14301: variable 'omit' from source: magic vars
30575 1726867662.14474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30575 1726867662.16670: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30575 1726867662.16746: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30575 1726867662.16790: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30575 1726867662.16835: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30575 1726867662.16982: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30575 1726867662.16986: variable 'network_provider' from source: set_fact
30575 1726867662.17085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30575 1726867662.17125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30575 1726867662.17156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30575 1726867662.17202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30575 1726867662.17232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30575 1726867662.17310: variable 'omit' from source: magic vars
30575 1726867662.17435: variable 'omit' from source: magic vars
30575 1726867662.17543: variable 'network_connections' from source: include params
30575 1726867662.17559: variable 'interface' from source: play vars
30575 1726867662.17649: variable 'interface' from source: play vars
30575 1726867662.17787: variable 'omit' from source: magic vars
30575 1726867662.17800: variable '__lsr_ansible_managed' from source: task vars
30575 1726867662.17881: variable '__lsr_ansible_managed' from source: task vars
30575 1726867662.18062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
30575 1726867662.18287: Loaded config def from plugin (lookup/template)
30575 1726867662.18322: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
30575 1726867662.18340: File lookup term: get_ansible_managed.j2
30575 1726867662.18347: variable 'ansible_search_path' from source: unknown
30575 1726867662.18357: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
30575 1726867662.18412: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
30575 1726867662.18416: variable 'ansible_search_path' from source: unknown
30575 1726867662.29769: variable 'ansible_managed' from source: unknown
30575 1726867662.29849: variable 'omit' from source: magic vars
30575 1726867662.29868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30575 1726867662.29886: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30575 1726867662.29896: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30575 1726867662.29907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867662.29917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30575 1726867662.29930: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30575 1726867662.29936: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867662.29940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867662.29998: Set connection var ansible_pipelining to False
30575 1726867662.30001: Set connection var ansible_shell_type to sh
30575 1726867662.30006: Set connection var ansible_shell_executable to /bin/sh
30575 1726867662.30011: Set connection var ansible_timeout to 10
30575 1726867662.30018: Set connection var ansible_module_compression to ZIP_DEFLATED
30575 1726867662.30022: Set connection var ansible_connection to ssh
30575 1726867662.30041: variable 'ansible_shell_executable' from source: unknown
30575 1726867662.30043: variable 'ansible_connection' from source: unknown
30575 1726867662.30046: variable 'ansible_module_compression' from source: unknown
30575 1726867662.30048: variable 'ansible_shell_type' from source: unknown
30575 1726867662.30058: variable 'ansible_shell_executable' from source: unknown
30575 1726867662.30062: variable 'ansible_host' from source: host vars for 'managed_node3'
30575 1726867662.30064: variable 'ansible_pipelining' from source: unknown
30575 1726867662.30066: variable 'ansible_timeout' from source: unknown
30575 1726867662.30068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30575 1726867662.30163: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
30575 1726867662.30174: variable 'omit' from source: magic vars
30575 1726867662.30179: starting attempt loop
30575 1726867662.30182: running the handler
30575 1726867662.30185: _low_level_execute_command(): starting
30575 1726867662.30187: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30575 1726867662.30683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30575 1726867662.30726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30575 1726867662.30729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30575 1726867662.30731: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30575 1726867662.30733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30575 1726867662.30771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<<
30575 1726867662.30787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30575 1726867662.30844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30575 1726867662.32624: stdout chunk (state=3): >>>/root <<<
30575 1726867662.32660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30575 1726867662.32714: stderr chunk (state=3): >>><<<
30575 1726867662.32720: stdout chunk (state=3): >>><<<
30575 1726867662.32723: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30575 1726867662.32736: _low_level_execute_command(): starting
30575 1726867662.32741: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867662.3272376-35192-215950863270162 `" && echo ansible-tmp-1726867662.3272376-35192-215950863270162="` echo /root/.ansible/tmp/ansible-tmp-1726867662.3272376-35192-215950863270162 `" ) && sleep 0'
30575 1726867662.33342: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30575 1726867662.33401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
30575 1726867662.33405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30575 1726867662.33409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30575 1726867662.33452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30575 1726867662.35363: stdout chunk (state=3): >>>ansible-tmp-1726867662.3272376-35192-215950863270162=/root/.ansible/tmp/ansible-tmp-1726867662.3272376-35192-215950863270162 <<<
30575 1726867662.35469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30575 1726867662.35492: stderr chunk (state=3): >>><<<
30575 1726867662.35495: stdout chunk (state=3): >>><<<
30575 1726867662.35508: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867662.3272376-35192-215950863270162=/root/.ansible/tmp/ansible-tmp-1726867662.3272376-35192-215950863270162 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30575 1726867662.35543: variable 'ansible_module_compression' from source: unknown
30575 1726867662.35572: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED
30575 1726867662.35606: variable 'ansible_facts' from source: unknown
30575 1726867662.35695: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867662.3272376-35192-215950863270162/AnsiballZ_network_connections.py
30575 1726867662.35787: Sending initial data
30575 1726867662.35791: Sent initial data (168 bytes)
30575 1726867662.36209: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30575 1726867662.36212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30575 1726867662.36218: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
30575 1726867662.36221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<<
30575 1726867662.36223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30575 1726867662.36261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
30575 1726867662.36264: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30575 1726867662.36315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30575 1726867662.37884: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<<
30575 1726867662.37887: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
30575 1726867662.37924: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
30575 1726867662.37972: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp5dip4y60 /root/.ansible/tmp/ansible-tmp-1726867662.3272376-35192-215950863270162/AnsiballZ_network_connections.py <<<
30575 1726867662.37974: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867662.3272376-35192-215950863270162/AnsiballZ_network_connections.py" <<<
30575 1726867662.38014: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp5dip4y60" to remote "/root/.ansible/tmp/ansible-tmp-1726867662.3272376-35192-215950863270162/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867662.3272376-35192-215950863270162/AnsiballZ_network_connections.py" <<<
30575 1726867662.38846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30575 1726867662.38871: stderr chunk (state=3): >>><<<
30575 1726867662.38913: stdout chunk (state=3): >>><<<
30575 1726867662.38919: done transferring module to remote
30575 1726867662.38931: _low_level_execute_command(): starting
30575 1726867662.38940: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867662.3272376-35192-215950863270162/ /root/.ansible/tmp/ansible-tmp-1726867662.3272376-35192-215950863270162/AnsiballZ_network_connections.py && sleep 0'
30575 1726867662.39570: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
30575 1726867662.39590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30575 1726867662.39603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30575 1726867662.39679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30575 1726867662.41470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30575 1726867662.41500: stderr chunk (state=3): >>><<<
30575 1726867662.41503: stdout chunk (state=3): >>><<<
30575 1726867662.41519: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30575 1726867662.41523: _low_level_execute_command(): starting
30575 1726867662.41525: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867662.3272376-35192-215950863270162/AnsiballZ_network_connections.py && sleep 0'
30575 1726867662.41915: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
30575 1726867662.41921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<<
30575 1726867662.41923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30575 1726867662.41925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30575 1726867662.41927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30575 1726867662.41981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<<
30575 1726867662.41985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30575 1726867662.42047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30575 1726867662.74632: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<<
30575 1726867662.76532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed.
<<< 30575 1726867662.76551: stderr chunk (state=3): >>><<< 30575 1726867662.76554: stdout chunk (state=3): >>><<< 30575 1726867662.76569: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867662.76598: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867662.3272376-35192-215950863270162/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867662.76605: _low_level_execute_command(): starting 30575 1726867662.76614: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867662.3272376-35192-215950863270162/ > /dev/null 2>&1 && sleep 0' 30575 1726867662.77046: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867662.77050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867662.77052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867662.77054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867662.77056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867662.77058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867662.77104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867662.77108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867662.77158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867662.79084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867662.79087: stdout chunk (state=3): >>><<< 30575 1726867662.79093: stderr chunk (state=3): >>><<< 30575 1726867662.79099: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867662.79127: handler run complete 30575 1726867662.79130: attempt loop complete, returning result 30575 1726867662.79132: _execute() done 30575 1726867662.79134: dumping result to json 30575 1726867662.79136: done dumping result, returning 30575 1726867662.79138: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-e081-a588-000000001d3b] 30575 1726867662.79150: sending task result for task 0affcac9-a3a5-e081-a588-000000001d3b 30575 1726867662.79243: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d3b 30575 1726867662.79246: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete 30575 1726867662.79366: no more pending results, returning what we have 30575 1726867662.79369: results queue empty 30575 1726867662.79370: checking for any_errors_fatal 30575 1726867662.79376: done checking for any_errors_fatal 30575 1726867662.79376: checking for max_fail_percentage 30575 1726867662.79379: done checking for max_fail_percentage 30575 1726867662.79380: checking to see if all hosts have failed and the running result is not ok 30575 1726867662.79381: done 
checking to see if all hosts have failed 30575 1726867662.79382: getting the remaining hosts for this loop 30575 1726867662.79383: done getting the remaining hosts for this loop 30575 1726867662.79386: getting the next task for host managed_node3 30575 1726867662.79394: done getting next task for host managed_node3 30575 1726867662.79397: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867662.79402: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867662.79412: getting variables 30575 1726867662.79413: in VariableManager get_vars() 30575 1726867662.79450: Calling all_inventory to load vars for managed_node3 30575 1726867662.79452: Calling groups_inventory to load vars for managed_node3 30575 1726867662.79454: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867662.79462: Calling all_plugins_play to load vars for managed_node3 30575 1726867662.79464: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867662.79467: Calling groups_plugins_play to load vars for managed_node3 30575 1726867662.80391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867662.81354: done with get_vars() 30575 1726867662.81370: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:27:42 -0400 (0:00:00.686) 0:01:38.191 ****** 30575 1726867662.81434: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867662.81678: worker is 1 (out of 1 available) 30575 1726867662.81694: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867662.81708: done queuing things up, now waiting for results queue to drain 30575 1726867662.81709: waiting for pending results... 
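The "Configure networking connection profiles" result logged above corresponds to an invocation of the `fedora.linux_system_roles.network` role. A minimal sketch of a play that would produce the logged `module_args` (the play layout and host grouping are assumptions; only the provider and connection settings are taken from the log):

```yaml
# Hypothetical play reconstructed from the logged module_args;
# only network_provider and the statebr connection come from the log.
- hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm
        network_connections:
          - name: statebr
            persistent_state: absent   # remove the profile if it exists
            state: down                # and take the device down
```

Note that the module's stderr ("no connection matches 'statebr' to delete") indicates the profile was already absent, yet the task still reported `changed: true`.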
30575 1726867662.82092: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867662.82097: in run() - task 0affcac9-a3a5-e081-a588-000000001d3c 30575 1726867662.82100: variable 'ansible_search_path' from source: unknown 30575 1726867662.82102: variable 'ansible_search_path' from source: unknown 30575 1726867662.82135: calling self._execute() 30575 1726867662.82238: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867662.82248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867662.82263: variable 'omit' from source: magic vars 30575 1726867662.82635: variable 'ansible_distribution_major_version' from source: facts 30575 1726867662.82649: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867662.82769: variable 'network_state' from source: role '' defaults 30575 1726867662.82786: Evaluated conditional (network_state != {}): False 30575 1726867662.82795: when evaluation is False, skipping this task 30575 1726867662.82798: _execute() done 30575 1726867662.82801: dumping result to json 30575 1726867662.82804: done dumping result, returning 30575 1726867662.82813: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-e081-a588-000000001d3c] 30575 1726867662.82825: sending task result for task 0affcac9-a3a5-e081-a588-000000001d3c
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30575 1726867662.82966: no more pending results, returning what we have 30575 1726867662.82971: results queue empty 30575 1726867662.82971: checking for any_errors_fatal 30575 1726867662.82987: done checking for any_errors_fatal 30575 1726867662.82988: checking for max_fail_percentage 30575 1726867662.82990: done checking for max_fail_percentage 30575 1726867662.82991:
checking to see if all hosts have failed and the running result is not ok 30575 1726867662.82992: done checking to see if all hosts have failed 30575 1726867662.82992: getting the remaining hosts for this loop 30575 1726867662.82994: done getting the remaining hosts for this loop 30575 1726867662.82997: getting the next task for host managed_node3 30575 1726867662.83010: done getting next task for host managed_node3 30575 1726867662.83014: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867662.83020: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867662.83030: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d3c 30575 1726867662.83033: WORKER PROCESS EXITING 30575 1726867662.83053: getting variables 30575 1726867662.83055: in VariableManager get_vars() 30575 1726867662.83248: Calling all_inventory to load vars for managed_node3 30575 1726867662.83251: Calling groups_inventory to load vars for managed_node3 30575 1726867662.83253: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867662.83261: Calling all_plugins_play to load vars for managed_node3 30575 1726867662.83263: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867662.83266: Calling groups_plugins_play to load vars for managed_node3 30575 1726867662.90343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867662.91855: done with get_vars() 30575 1726867662.91880: done getting variables 30575 1726867662.91934: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:27:42 -0400 (0:00:00.105) 0:01:38.297 ****** 30575 1726867662.91965: entering _queue_task() for managed_node3/debug 30575 1726867662.92345: worker is 1 (out of 1 available) 30575 1726867662.92358: exiting _queue_task() for managed_node3/debug 30575 1726867662.92371: done queuing things up, now waiting for results queue to drain 30575 1726867662.92373: waiting for pending results... 
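The "Configure networking state" task above was skipped because the role's `network_state` variable came from the role defaults as an empty dict, so the gating condition `network_state != {}` evaluated to False. A hedged sketch of the variable involved (the commented example values are illustrative only and not taken from this run):

```yaml
# Role default observed in the log -> the conditional is False and the
# "Configure networking state" task skips:
network_state: {}

# Supplying a non-empty, Nmstate-style state would make the conditional
# True, e.g. (illustrative values only):
# network_state:
#   interfaces:
#     - name: eth0
#       type: ethernet
#       state: up
```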
30575 1726867662.92684: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867662.92858: in run() - task 0affcac9-a3a5-e081-a588-000000001d3d 30575 1726867662.92882: variable 'ansible_search_path' from source: unknown 30575 1726867662.92892: variable 'ansible_search_path' from source: unknown 30575 1726867662.92942: calling self._execute() 30575 1726867662.93052: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867662.93065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867662.93183: variable 'omit' from source: magic vars 30575 1726867662.93494: variable 'ansible_distribution_major_version' from source: facts 30575 1726867662.93519: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867662.93533: variable 'omit' from source: magic vars 30575 1726867662.93600: variable 'omit' from source: magic vars 30575 1726867662.93646: variable 'omit' from source: magic vars 30575 1726867662.93694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867662.93741: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867662.93766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867662.93790: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867662.93808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867662.93850: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867662.93860: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867662.93950: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30575 1726867662.93982: Set connection var ansible_pipelining to False 30575 1726867662.93992: Set connection var ansible_shell_type to sh 30575 1726867662.94004: Set connection var ansible_shell_executable to /bin/sh 30575 1726867662.94019: Set connection var ansible_timeout to 10 30575 1726867662.94030: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867662.94043: Set connection var ansible_connection to ssh 30575 1726867662.94079: variable 'ansible_shell_executable' from source: unknown 30575 1726867662.94087: variable 'ansible_connection' from source: unknown 30575 1726867662.94095: variable 'ansible_module_compression' from source: unknown 30575 1726867662.94102: variable 'ansible_shell_type' from source: unknown 30575 1726867662.94109: variable 'ansible_shell_executable' from source: unknown 30575 1726867662.94118: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867662.94127: variable 'ansible_pipelining' from source: unknown 30575 1726867662.94134: variable 'ansible_timeout' from source: unknown 30575 1726867662.94143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867662.94294: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867662.94385: variable 'omit' from source: magic vars 30575 1726867662.94388: starting attempt loop 30575 1726867662.94391: running the handler 30575 1726867662.94456: variable '__network_connections_result' from source: set_fact 30575 1726867662.94517: handler run complete 30575 1726867662.94541: attempt loop complete, returning result 30575 1726867662.94549: _execute() done 30575 1726867662.94555: dumping result to json 30575 1726867662.94600: 
done dumping result, returning 30575 1726867662.94604: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-e081-a588-000000001d3d] 30575 1726867662.94606: sending task result for task 0affcac9-a3a5-e081-a588-000000001d3d
ok: [managed_node3] => {
    "__network_connections_result.stderr_lines": [
        "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete"
    ]
}
30575 1726867662.94949: no more pending results, returning what we have 30575 1726867662.94952: results queue empty 30575 1726867662.94953: checking for any_errors_fatal 30575 1726867662.94960: done checking for any_errors_fatal 30575 1726867662.94961: checking for max_fail_percentage 30575 1726867662.94963: done checking for max_fail_percentage 30575 1726867662.94964: checking to see if all hosts have failed and the running result is not ok 30575 1726867662.94965: done checking to see if all hosts have failed 30575 1726867662.94965: getting the remaining hosts for this loop 30575 1726867662.94967: done getting the remaining hosts for this loop 30575 1726867662.94970: getting the next task for host managed_node3 30575 1726867662.94980: done getting next task for host managed_node3 30575 1726867662.94985: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867662.94989: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867662.95002: getting variables 30575 1726867662.95004: in VariableManager get_vars() 30575 1726867662.95049: Calling all_inventory to load vars for managed_node3 30575 1726867662.95052: Calling groups_inventory to load vars for managed_node3 30575 1726867662.95054: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867662.95064: Calling all_plugins_play to load vars for managed_node3 30575 1726867662.95067: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867662.95070: Calling groups_plugins_play to load vars for managed_node3 30575 1726867662.95596: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d3d 30575 1726867662.95599: WORKER PROCESS EXITING 30575 1726867662.96561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867662.97574: done with get_vars() 30575 1726867662.97591: done getting variables 30575 1726867662.97635: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:27:42 -0400 (0:00:00.056) 0:01:38.354 ****** 30575 1726867662.97663: entering _queue_task() for managed_node3/debug 30575 1726867662.97898: worker is 1 (out of 1 available) 30575 1726867662.97910: exiting _queue_task() for managed_node3/debug 30575 1726867662.97925: done queuing things up, now waiting for results queue to drain 30575 1726867662.97927: waiting for pending results... 30575 1726867662.98117: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867662.98223: in run() - task 0affcac9-a3a5-e081-a588-000000001d3e 30575 1726867662.98236: variable 'ansible_search_path' from source: unknown 30575 1726867662.98240: variable 'ansible_search_path' from source: unknown 30575 1726867662.98270: calling self._execute() 30575 1726867662.98346: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867662.98350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867662.98359: variable 'omit' from source: magic vars 30575 1726867662.98882: variable 'ansible_distribution_major_version' from source: facts 30575 1726867662.98886: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867662.98889: variable 'omit' from source: magic vars 30575 1726867662.98892: variable 'omit' from source: magic vars 30575 1726867662.98897: variable 'omit' from source: magic vars 30575 1726867662.98907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867662.98950: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867662.98972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867662.98997: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867662.99021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867662.99052: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867662.99060: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867662.99067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867662.99174: Set connection var ansible_pipelining to False 30575 1726867662.99185: Set connection var ansible_shell_type to sh 30575 1726867662.99195: Set connection var ansible_shell_executable to /bin/sh 30575 1726867662.99204: Set connection var ansible_timeout to 10 30575 1726867662.99213: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867662.99231: Set connection var ansible_connection to ssh 30575 1726867662.99259: variable 'ansible_shell_executable' from source: unknown 30575 1726867662.99269: variable 'ansible_connection' from source: unknown 30575 1726867662.99279: variable 'ansible_module_compression' from source: unknown 30575 1726867662.99287: variable 'ansible_shell_type' from source: unknown 30575 1726867662.99294: variable 'ansible_shell_executable' from source: unknown 30575 1726867662.99302: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867662.99309: variable 'ansible_pipelining' from source: unknown 30575 1726867662.99319: variable 'ansible_timeout' from source: unknown 30575 1726867662.99327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867662.99481: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867662.99504: variable 'omit' from source: magic vars 30575 1726867662.99534: starting attempt loop 30575 1726867662.99536: running the handler 30575 1726867662.99570: variable '__network_connections_result' from source: set_fact 30575 1726867662.99632: variable '__network_connections_result' from source: set_fact 30575 1726867662.99713: handler run complete 30575 1726867662.99732: attempt loop complete, returning result 30575 1726867662.99735: _execute() done 30575 1726867662.99738: dumping result to json 30575 1726867662.99741: done dumping result, returning 30575 1726867662.99748: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-e081-a588-000000001d3e] 30575 1726867662.99753: sending task result for task 0affcac9-a3a5-e081-a588-000000001d3e 30575 1726867662.99841: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d3e 30575 1726867662.99844: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "stderr_lines": [ "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } } 30575 1726867662.99955: no more pending results, returning what we have 30575 1726867662.99958: results queue empty 30575 1726867662.99958: checking for any_errors_fatal 30575 1726867662.99963: 
done checking for any_errors_fatal 30575 1726867662.99963: checking for max_fail_percentage 30575 1726867662.99965: done checking for max_fail_percentage 30575 1726867662.99965: checking to see if all hosts have failed and the running result is not ok 30575 1726867662.99966: done checking to see if all hosts have failed 30575 1726867662.99967: getting the remaining hosts for this loop 30575 1726867662.99968: done getting the remaining hosts for this loop 30575 1726867662.99971: getting the next task for host managed_node3 30575 1726867662.99979: done getting next task for host managed_node3 30575 1726867662.99983: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867662.99987: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867662.99997: getting variables 30575 1726867662.99998: in VariableManager get_vars() 30575 1726867663.00031: Calling all_inventory to load vars for managed_node3 30575 1726867663.00034: Calling groups_inventory to load vars for managed_node3 30575 1726867663.00036: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867663.00048: Calling all_plugins_play to load vars for managed_node3 30575 1726867663.00050: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867663.00053: Calling groups_plugins_play to load vars for managed_node3 30575 1726867663.00803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867663.01910: done with get_vars() 30575 1726867663.01938: done getting variables 30575 1726867663.01997: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:27:43 -0400 (0:00:00.043) 0:01:38.397 ****** 30575 1726867663.02042: entering _queue_task() for managed_node3/debug 30575 1726867663.02460: worker is 1 (out of 1 available) 30575 1726867663.02471: exiting _queue_task() for managed_node3/debug 30575 1726867663.02484: done queuing things up, now waiting for results queue to drain 30575 1726867663.02486: waiting for pending results... 
30575 1726867663.02692: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867663.02859: in run() - task 0affcac9-a3a5-e081-a588-000000001d3f 30575 1726867663.02883: variable 'ansible_search_path' from source: unknown 30575 1726867663.02887: variable 'ansible_search_path' from source: unknown 30575 1726867663.02929: calling self._execute() 30575 1726867663.03015: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867663.03020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867663.03040: variable 'omit' from source: magic vars 30575 1726867663.03316: variable 'ansible_distribution_major_version' from source: facts 30575 1726867663.03328: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867663.03412: variable 'network_state' from source: role '' defaults 30575 1726867663.03422: Evaluated conditional (network_state != {}): False 30575 1726867663.03426: when evaluation is False, skipping this task 30575 1726867663.03429: _execute() done 30575 1726867663.03433: dumping result to json 30575 1726867663.03435: done dumping result, returning 30575 1726867663.03446: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-e081-a588-000000001d3f] 30575 1726867663.03449: sending task result for task 0affcac9-a3a5-e081-a588-000000001d3f 30575 1726867663.03531: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d3f 30575 1726867663.03534: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30575 1726867663.03617: no more pending results, returning what we have 30575 1726867663.03621: results queue empty 30575 1726867663.03621: checking for any_errors_fatal 30575 1726867663.03626: done checking for any_errors_fatal 30575 1726867663.03627: checking for 
max_fail_percentage 30575 1726867663.03628: done checking for max_fail_percentage 30575 1726867663.03629: checking to see if all hosts have failed and the running result is not ok 30575 1726867663.03630: done checking to see if all hosts have failed 30575 1726867663.03630: getting the remaining hosts for this loop 30575 1726867663.03631: done getting the remaining hosts for this loop 30575 1726867663.03634: getting the next task for host managed_node3 30575 1726867663.03641: done getting next task for host managed_node3 30575 1726867663.03644: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867663.03649: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867663.03675: getting variables 30575 1726867663.03678: in VariableManager get_vars() 30575 1726867663.03711: Calling all_inventory to load vars for managed_node3 30575 1726867663.03713: Calling groups_inventory to load vars for managed_node3 30575 1726867663.03714: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867663.03720: Calling all_plugins_play to load vars for managed_node3 30575 1726867663.03722: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867663.03724: Calling groups_plugins_play to load vars for managed_node3 30575 1726867663.04726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867663.06152: done with get_vars() 30575 1726867663.06171: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:27:43 -0400 (0:00:00.042) 0:01:38.440 ****** 30575 1726867663.06308: entering _queue_task() for managed_node3/ping 30575 1726867663.06594: worker is 1 (out of 1 available) 30575 1726867663.06605: exiting _queue_task() for managed_node3/ping 30575 1726867663.06615: done queuing things up, now waiting for results queue to drain 30575 1726867663.06617: waiting for pending results... 
30575 1726867663.06942: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867663.07026: in run() - task 0affcac9-a3a5-e081-a588-000000001d40 30575 1726867663.07038: variable 'ansible_search_path' from source: unknown 30575 1726867663.07041: variable 'ansible_search_path' from source: unknown 30575 1726867663.07070: calling self._execute() 30575 1726867663.07157: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867663.07163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867663.07172: variable 'omit' from source: magic vars 30575 1726867663.07457: variable 'ansible_distribution_major_version' from source: facts 30575 1726867663.07467: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867663.07473: variable 'omit' from source: magic vars 30575 1726867663.07527: variable 'omit' from source: magic vars 30575 1726867663.07549: variable 'omit' from source: magic vars 30575 1726867663.07583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867663.07609: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867663.07627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867663.07640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867663.07650: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867663.07678: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867663.07681: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867663.07684: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867663.07753: Set connection var ansible_pipelining to False 30575 1726867663.07756: Set connection var ansible_shell_type to sh 30575 1726867663.07761: Set connection var ansible_shell_executable to /bin/sh 30575 1726867663.07766: Set connection var ansible_timeout to 10 30575 1726867663.07771: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867663.07783: Set connection var ansible_connection to ssh 30575 1726867663.07803: variable 'ansible_shell_executable' from source: unknown 30575 1726867663.07806: variable 'ansible_connection' from source: unknown 30575 1726867663.07808: variable 'ansible_module_compression' from source: unknown 30575 1726867663.07811: variable 'ansible_shell_type' from source: unknown 30575 1726867663.07813: variable 'ansible_shell_executable' from source: unknown 30575 1726867663.07815: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867663.07821: variable 'ansible_pipelining' from source: unknown 30575 1726867663.07824: variable 'ansible_timeout' from source: unknown 30575 1726867663.07827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867663.07973: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867663.07991: variable 'omit' from source: magic vars 30575 1726867663.07998: starting attempt loop 30575 1726867663.08000: running the handler 30575 1726867663.08018: _low_level_execute_command(): starting 30575 1726867663.08021: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867663.08522: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 
1726867663.08525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867663.08529: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867663.08531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.08585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867663.08592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867663.08594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867663.08654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867663.10330: stdout chunk (state=3): >>>/root <<< 30575 1726867663.10445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867663.10449: stdout chunk (state=3): >>><<< 30575 1726867663.10457: stderr chunk (state=3): >>><<< 30575 1726867663.10475: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867663.10488: _low_level_execute_command(): starting 30575 1726867663.10493: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867663.104742-35236-273979386903622 `" && echo ansible-tmp-1726867663.104742-35236-273979386903622="` echo /root/.ansible/tmp/ansible-tmp-1726867663.104742-35236-273979386903622 `" ) && sleep 0' 30575 1726867663.10913: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867663.10916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867663.10919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867663.10927: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867663.10929: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.10968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867663.10972: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867663.11023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867663.12920: stdout chunk (state=3): >>>ansible-tmp-1726867663.104742-35236-273979386903622=/root/.ansible/tmp/ansible-tmp-1726867663.104742-35236-273979386903622 <<< 30575 1726867663.13028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867663.13046: stderr chunk (state=3): >>><<< 30575 1726867663.13051: stdout chunk (state=3): >>><<< 30575 1726867663.13067: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867663.104742-35236-273979386903622=/root/.ansible/tmp/ansible-tmp-1726867663.104742-35236-273979386903622 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867663.13100: variable 'ansible_module_compression' from source: unknown 30575 1726867663.13133: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30575 1726867663.13170: variable 'ansible_facts' from source: unknown 30575 1726867663.13214: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867663.104742-35236-273979386903622/AnsiballZ_ping.py 30575 1726867663.13306: Sending initial data 30575 1726867663.13310: Sent initial data (152 bytes) 30575 1726867663.13726: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867663.13729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.13731: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867663.13733: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867663.13735: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.13766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867663.13783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867663.13826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867663.15360: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867663.15364: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867663.15401: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867663.15452: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmphhb3lm6o /root/.ansible/tmp/ansible-tmp-1726867663.104742-35236-273979386903622/AnsiballZ_ping.py <<< 30575 1726867663.15455: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867663.104742-35236-273979386903622/AnsiballZ_ping.py" <<< 30575 1726867663.15492: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmphhb3lm6o" to remote "/root/.ansible/tmp/ansible-tmp-1726867663.104742-35236-273979386903622/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867663.104742-35236-273979386903622/AnsiballZ_ping.py" <<< 30575 1726867663.16007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867663.16041: stderr chunk (state=3): >>><<< 30575 1726867663.16045: stdout chunk (state=3): >>><<< 30575 1726867663.16083: done transferring module to remote 30575 1726867663.16091: _low_level_execute_command(): starting 30575 1726867663.16094: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867663.104742-35236-273979386903622/ /root/.ansible/tmp/ansible-tmp-1726867663.104742-35236-273979386903622/AnsiballZ_ping.py && sleep 0' 30575 1726867663.16498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867663.16502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.16504: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867663.16506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867663.16511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.16562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867663.16565: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867663.16605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867663.18333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867663.18353: stderr chunk (state=3): >>><<< 30575 1726867663.18356: stdout chunk (state=3): >>><<< 30575 1726867663.18367: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867663.18369: _low_level_execute_command(): starting 30575 1726867663.18374: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867663.104742-35236-273979386903622/AnsiballZ_ping.py && sleep 0' 30575 1726867663.18748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867663.18751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867663.18781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867663.18784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867663.18786: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867663.18789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.18832: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867663.18844: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867663.18900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867663.36711: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30575 1726867663.37948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867663.37973: stderr chunk (state=3): >>><<< 30575 1726867663.37978: stdout chunk (state=3): >>><<< 30575 1726867663.37995: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.15.68 closed. 30575 1726867663.38022: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867663.104742-35236-273979386903622/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867663.38028: _low_level_execute_command(): starting 30575 1726867663.38034: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867663.104742-35236-273979386903622/ > /dev/null 2>&1 && sleep 0' 30575 1726867663.38532: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867663.38535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.38537: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867663.38540: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867663.38542: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867663.38543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.38598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867663.38605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867663.38607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867663.38648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867663.40688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867663.40691: stdout chunk (state=3): >>><<< 30575 1726867663.40693: stderr chunk (state=3): >>><<< 30575 1726867663.40695: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867663.40698: handler run complete 30575 1726867663.40699: attempt loop complete, returning result 30575 1726867663.40701: _execute() done 30575 1726867663.40702: dumping result to json 30575 1726867663.40704: done dumping result, returning 30575 1726867663.40705: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-e081-a588-000000001d40] 30575 1726867663.40707: sending task result for task 0affcac9-a3a5-e081-a588-000000001d40 30575 1726867663.40767: done sending task result for task 0affcac9-a3a5-e081-a588-000000001d40 30575 1726867663.40770: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30575 1726867663.40854: no more pending results, returning what we have 30575 1726867663.40858: results queue empty 30575 1726867663.40858: checking for any_errors_fatal 30575 1726867663.40865: done checking for any_errors_fatal 30575 1726867663.40865: checking for max_fail_percentage 30575 1726867663.40867: done checking for max_fail_percentage 30575 1726867663.40867: checking to see if all hosts have failed and the running result is not ok 30575 1726867663.40868: done checking to see if all hosts have failed 30575 1726867663.40869: getting the remaining hosts for this loop 30575 1726867663.40870: done getting the remaining hosts for this loop 30575 1726867663.40873: getting the next task for host managed_node3 30575 1726867663.40885: done getting next task for host managed_node3 30575 1726867663.40898: ^ task is: TASK: meta (role_complete) 30575 1726867663.40903: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867663.40918: getting variables 30575 1726867663.40920: in VariableManager get_vars() 30575 1726867663.40962: Calling all_inventory to load vars for managed_node3 30575 1726867663.40964: Calling groups_inventory to load vars for managed_node3 30575 1726867663.40966: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867663.40975: Calling all_plugins_play to load vars for managed_node3 30575 1726867663.41040: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867663.41046: Calling groups_plugins_play to load vars for managed_node3 30575 1726867663.41927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867663.42827: done with get_vars() 30575 1726867663.42843: done getting variables 30575 1726867663.42905: done queuing things up, now waiting for results queue to drain 30575 1726867663.42907: results queue empty 30575 1726867663.42907: checking for any_errors_fatal 30575 1726867663.42909: done checking for 
any_errors_fatal 30575 1726867663.42909: checking for max_fail_percentage 30575 1726867663.42910: done checking for max_fail_percentage 30575 1726867663.42910: checking to see if all hosts have failed and the running result is not ok 30575 1726867663.42911: done checking to see if all hosts have failed 30575 1726867663.42911: getting the remaining hosts for this loop 30575 1726867663.42912: done getting the remaining hosts for this loop 30575 1726867663.42913: getting the next task for host managed_node3 30575 1726867663.42919: done getting next task for host managed_node3 30575 1726867663.42921: ^ task is: TASK: Asserts 30575 1726867663.42922: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867663.42925: getting variables 30575 1726867663.42925: in VariableManager get_vars() 30575 1726867663.42934: Calling all_inventory to load vars for managed_node3 30575 1726867663.42935: Calling groups_inventory to load vars for managed_node3 30575 1726867663.42936: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867663.42939: Calling all_plugins_play to load vars for managed_node3 30575 1726867663.42940: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867663.42942: Calling groups_plugins_play to load vars for managed_node3 30575 1726867663.44158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867663.45747: done with get_vars() 30575 1726867663.45767: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 17:27:43 -0400 (0:00:00.395) 0:01:38.835 ****** 30575 1726867663.45843: entering _queue_task() for managed_node3/include_tasks 30575 1726867663.46252: worker is 1 (out of 1 available) 30575 1726867663.46265: exiting _queue_task() for managed_node3/include_tasks 30575 1726867663.46480: done queuing things up, now waiting for results queue to drain 30575 1726867663.46482: waiting for pending results... 
30575 1726867663.46584: running TaskExecutor() for managed_node3/TASK: Asserts 30575 1726867663.46729: in run() - task 0affcac9-a3a5-e081-a588-000000001749 30575 1726867663.46748: variable 'ansible_search_path' from source: unknown 30575 1726867663.46755: variable 'ansible_search_path' from source: unknown 30575 1726867663.46809: variable 'lsr_assert' from source: include params 30575 1726867663.47055: variable 'lsr_assert' from source: include params 30575 1726867663.47143: variable 'omit' from source: magic vars 30575 1726867663.47276: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867663.47367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867663.47371: variable 'omit' from source: magic vars 30575 1726867663.47572: variable 'ansible_distribution_major_version' from source: facts 30575 1726867663.47593: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867663.47605: variable 'item' from source: unknown 30575 1726867663.47679: variable 'item' from source: unknown 30575 1726867663.47725: variable 'item' from source: unknown 30575 1726867663.47789: variable 'item' from source: unknown 30575 1726867663.48186: dumping result to json 30575 1726867663.48190: done dumping result, returning 30575 1726867663.48192: done running TaskExecutor() for managed_node3/TASK: Asserts [0affcac9-a3a5-e081-a588-000000001749] 30575 1726867663.48195: sending task result for task 0affcac9-a3a5-e081-a588-000000001749 30575 1726867663.48242: done sending task result for task 0affcac9-a3a5-e081-a588-000000001749 30575 1726867663.48246: WORKER PROCESS EXITING 30575 1726867663.48310: no more pending results, returning what we have 30575 1726867663.48317: in VariableManager get_vars() 30575 1726867663.48361: Calling all_inventory to load vars for managed_node3 30575 1726867663.48364: Calling groups_inventory to load vars for managed_node3 30575 1726867663.48368: Calling all_plugins_inventory 
to load vars for managed_node3 30575 1726867663.48383: Calling all_plugins_play to load vars for managed_node3 30575 1726867663.48386: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867663.48390: Calling groups_plugins_play to load vars for managed_node3 30575 1726867663.49794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867663.51475: done with get_vars() 30575 1726867663.51494: variable 'ansible_search_path' from source: unknown 30575 1726867663.51496: variable 'ansible_search_path' from source: unknown 30575 1726867663.51536: we have included files to process 30575 1726867663.51538: generating all_blocks data 30575 1726867663.51540: done generating all_blocks data 30575 1726867663.51547: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30575 1726867663.51548: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30575 1726867663.51550: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30575 1726867663.51656: in VariableManager get_vars() 30575 1726867663.51682: done with get_vars() 30575 1726867663.51796: done processing included file 30575 1726867663.51798: iterating over new_blocks loaded from include file 30575 1726867663.51800: in VariableManager get_vars() 30575 1726867663.51818: done with get_vars() 30575 1726867663.51819: filtering new block on tags 30575 1726867663.51854: done filtering new block on tags 30575 1726867663.51857: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node3 => (item=tasks/assert_profile_absent.yml) 30575 
1726867663.51861: extending task lists for all hosts with included blocks 30575 1726867663.52907: done extending task lists 30575 1726867663.52909: done processing included files 30575 1726867663.52910: results queue empty 30575 1726867663.52910: checking for any_errors_fatal 30575 1726867663.52912: done checking for any_errors_fatal 30575 1726867663.52912: checking for max_fail_percentage 30575 1726867663.52913: done checking for max_fail_percentage 30575 1726867663.52914: checking to see if all hosts have failed and the running result is not ok 30575 1726867663.52918: done checking to see if all hosts have failed 30575 1726867663.52919: getting the remaining hosts for this loop 30575 1726867663.52920: done getting the remaining hosts for this loop 30575 1726867663.52922: getting the next task for host managed_node3 30575 1726867663.52927: done getting next task for host managed_node3 30575 1726867663.52929: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30575 1726867663.52932: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867663.52937: getting variables 30575 1726867663.52938: in VariableManager get_vars() 30575 1726867663.52949: Calling all_inventory to load vars for managed_node3 30575 1726867663.52951: Calling groups_inventory to load vars for managed_node3 30575 1726867663.52953: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867663.52959: Calling all_plugins_play to load vars for managed_node3 30575 1726867663.52961: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867663.52964: Calling groups_plugins_play to load vars for managed_node3 30575 1726867663.53667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867663.54521: done with get_vars() 30575 1726867663.54536: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 17:27:43 -0400 (0:00:00.087) 0:01:38.923 ****** 30575 1726867663.54588: entering _queue_task() for managed_node3/include_tasks 30575 1726867663.54885: worker is 1 (out of 1 available) 30575 1726867663.54898: exiting _queue_task() for managed_node3/include_tasks 30575 1726867663.54909: done queuing things up, now waiting for results queue to drain 30575 1726867663.54911: waiting for pending results... 
30575 1726867663.55297: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 30575 1726867663.55351: in run() - task 0affcac9-a3a5-e081-a588-000000001e99 30575 1726867663.55370: variable 'ansible_search_path' from source: unknown 30575 1726867663.55379: variable 'ansible_search_path' from source: unknown 30575 1726867663.55425: calling self._execute() 30575 1726867663.55529: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867663.55542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867663.55557: variable 'omit' from source: magic vars 30575 1726867663.55943: variable 'ansible_distribution_major_version' from source: facts 30575 1726867663.55960: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867663.55963: _execute() done 30575 1726867663.55966: dumping result to json 30575 1726867663.55970: done dumping result, returning 30575 1726867663.55979: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-e081-a588-000000001e99] 30575 1726867663.55985: sending task result for task 0affcac9-a3a5-e081-a588-000000001e99 30575 1726867663.56069: done sending task result for task 0affcac9-a3a5-e081-a588-000000001e99 30575 1726867663.56071: WORKER PROCESS EXITING 30575 1726867663.56099: no more pending results, returning what we have 30575 1726867663.56104: in VariableManager get_vars() 30575 1726867663.56150: Calling all_inventory to load vars for managed_node3 30575 1726867663.56153: Calling groups_inventory to load vars for managed_node3 30575 1726867663.56156: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867663.56168: Calling all_plugins_play to load vars for managed_node3 30575 1726867663.56171: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867663.56173: Calling groups_plugins_play to load vars for managed_node3 30575 
1726867663.57060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867663.58067: done with get_vars() 30575 1726867663.58088: variable 'ansible_search_path' from source: unknown 30575 1726867663.58089: variable 'ansible_search_path' from source: unknown 30575 1726867663.58097: variable 'item' from source: include params 30575 1726867663.58198: variable 'item' from source: include params 30575 1726867663.58233: we have included files to process 30575 1726867663.58234: generating all_blocks data 30575 1726867663.58236: done generating all_blocks data 30575 1726867663.58237: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867663.58238: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867663.58241: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867663.59123: done processing included file 30575 1726867663.59126: iterating over new_blocks loaded from include file 30575 1726867663.59127: in VariableManager get_vars() 30575 1726867663.59144: done with get_vars() 30575 1726867663.59146: filtering new block on tags 30575 1726867663.59214: done filtering new block on tags 30575 1726867663.59217: in VariableManager get_vars() 30575 1726867663.59233: done with get_vars() 30575 1726867663.59235: filtering new block on tags 30575 1726867663.59295: done filtering new block on tags 30575 1726867663.59297: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 30575 1726867663.59304: extending task lists for all hosts with included blocks 30575 1726867663.59475: done 
extending task lists 30575 1726867663.59476: done processing included files 30575 1726867663.59476: results queue empty 30575 1726867663.59479: checking for any_errors_fatal 30575 1726867663.59482: done checking for any_errors_fatal 30575 1726867663.59482: checking for max_fail_percentage 30575 1726867663.59483: done checking for max_fail_percentage 30575 1726867663.59484: checking to see if all hosts have failed and the running result is not ok 30575 1726867663.59484: done checking to see if all hosts have failed 30575 1726867663.59485: getting the remaining hosts for this loop 30575 1726867663.59485: done getting the remaining hosts for this loop 30575 1726867663.59487: getting the next task for host managed_node3 30575 1726867663.59490: done getting next task for host managed_node3 30575 1726867663.59492: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30575 1726867663.59494: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30575 1726867663.59496: getting variables 30575 1726867663.59496: in VariableManager get_vars() 30575 1726867663.59503: Calling all_inventory to load vars for managed_node3 30575 1726867663.59505: Calling groups_inventory to load vars for managed_node3 30575 1726867663.59507: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867663.59511: Calling all_plugins_play to load vars for managed_node3 30575 1726867663.59512: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867663.59514: Calling groups_plugins_play to load vars for managed_node3 30575 1726867663.60132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867663.61100: done with get_vars() 30575 1726867663.61118: done getting variables 30575 1726867663.61151: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:27:43 -0400 (0:00:00.065) 0:01:38.989 ****** 30575 1726867663.61181: entering _queue_task() for managed_node3/set_fact 30575 1726867663.61539: worker is 1 (out of 1 available) 30575 1726867663.61552: exiting _queue_task() for managed_node3/set_fact 30575 1726867663.61567: done queuing things up, now waiting for results queue to drain 30575 1726867663.61569: waiting for pending results... 
30575 1726867663.61901: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 30575 1726867663.61951: in run() - task 0affcac9-a3a5-e081-a588-000000001f17 30575 1726867663.61969: variable 'ansible_search_path' from source: unknown 30575 1726867663.61972: variable 'ansible_search_path' from source: unknown 30575 1726867663.62012: calling self._execute() 30575 1726867663.62111: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867663.62116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867663.62131: variable 'omit' from source: magic vars 30575 1726867663.62503: variable 'ansible_distribution_major_version' from source: facts 30575 1726867663.62516: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867663.62524: variable 'omit' from source: magic vars 30575 1726867663.62568: variable 'omit' from source: magic vars 30575 1726867663.62603: variable 'omit' from source: magic vars 30575 1726867663.62651: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867663.62699: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867663.62702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867663.62712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867663.62728: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867663.62902: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867663.62905: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867663.62907: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867663.62910: Set connection var ansible_pipelining to False 30575 1726867663.62912: Set connection var ansible_shell_type to sh 30575 1726867663.62914: Set connection var ansible_shell_executable to /bin/sh 30575 1726867663.62916: Set connection var ansible_timeout to 10 30575 1726867663.62918: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867663.62920: Set connection var ansible_connection to ssh 30575 1726867663.62930: variable 'ansible_shell_executable' from source: unknown 30575 1726867663.62933: variable 'ansible_connection' from source: unknown 30575 1726867663.62935: variable 'ansible_module_compression' from source: unknown 30575 1726867663.62938: variable 'ansible_shell_type' from source: unknown 30575 1726867663.62941: variable 'ansible_shell_executable' from source: unknown 30575 1726867663.62943: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867663.62945: variable 'ansible_pipelining' from source: unknown 30575 1726867663.62948: variable 'ansible_timeout' from source: unknown 30575 1726867663.62950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867663.63088: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867663.63100: variable 'omit' from source: magic vars 30575 1726867663.63105: starting attempt loop 30575 1726867663.63108: running the handler 30575 1726867663.63124: handler run complete 30575 1726867663.63136: attempt loop complete, returning result 30575 1726867663.63147: _execute() done 30575 1726867663.63150: dumping result to json 30575 1726867663.63153: done dumping result, returning 30575 1726867663.63155: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-e081-a588-000000001f17] 30575 1726867663.63157: sending task result for task 0affcac9-a3a5-e081-a588-000000001f17 30575 1726867663.63236: done sending task result for task 0affcac9-a3a5-e081-a588-000000001f17 30575 1726867663.63239: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30575 1726867663.63317: no more pending results, returning what we have 30575 1726867663.63321: results queue empty 30575 1726867663.63321: checking for any_errors_fatal 30575 1726867663.63322: done checking for any_errors_fatal 30575 1726867663.63323: checking for max_fail_percentage 30575 1726867663.63325: done checking for max_fail_percentage 30575 1726867663.63326: checking to see if all hosts have failed and the running result is not ok 30575 1726867663.63327: done checking to see if all hosts have failed 30575 1726867663.63327: getting the remaining hosts for this loop 30575 1726867663.63329: done getting the remaining hosts for this loop 30575 1726867663.63332: getting the next task for host managed_node3 30575 1726867663.63341: done getting next task for host managed_node3 30575 1726867663.63343: ^ task is: TASK: Stat profile file 30575 1726867663.63348: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867663.63352: getting variables 30575 1726867663.63353: in VariableManager get_vars() 30575 1726867663.63511: Calling all_inventory to load vars for managed_node3 30575 1726867663.63514: Calling groups_inventory to load vars for managed_node3 30575 1726867663.63517: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867663.63525: Calling all_plugins_play to load vars for managed_node3 30575 1726867663.63528: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867663.63530: Calling groups_plugins_play to load vars for managed_node3 30575 1726867663.64522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867663.65386: done with get_vars() 30575 1726867663.65400: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:27:43 -0400 (0:00:00.042) 0:01:39.032 ****** 30575 1726867663.65463: entering _queue_task() for managed_node3/stat 30575 1726867663.65689: worker is 1 (out of 1 available) 30575 1726867663.65704: exiting _queue_task() for managed_node3/stat 30575 1726867663.65716: done queuing things up, now waiting for results queue to drain 30575 1726867663.65718: 
waiting for pending results... 30575 1726867663.66097: running TaskExecutor() for managed_node3/TASK: Stat profile file 30575 1726867663.66108: in run() - task 0affcac9-a3a5-e081-a588-000000001f18 30575 1726867663.66130: variable 'ansible_search_path' from source: unknown 30575 1726867663.66138: variable 'ansible_search_path' from source: unknown 30575 1726867663.66176: calling self._execute() 30575 1726867663.66276: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867663.66291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867663.66324: variable 'omit' from source: magic vars 30575 1726867663.66748: variable 'ansible_distribution_major_version' from source: facts 30575 1726867663.66758: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867663.66764: variable 'omit' from source: magic vars 30575 1726867663.66802: variable 'omit' from source: magic vars 30575 1726867663.66872: variable 'profile' from source: play vars 30575 1726867663.66875: variable 'interface' from source: play vars 30575 1726867663.66928: variable 'interface' from source: play vars 30575 1726867663.66941: variable 'omit' from source: magic vars 30575 1726867663.66974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867663.67004: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867663.67020: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867663.67034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867663.67044: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867663.67069: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30575 1726867663.67072: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867663.67074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867663.67145: Set connection var ansible_pipelining to False 30575 1726867663.67148: Set connection var ansible_shell_type to sh 30575 1726867663.67153: Set connection var ansible_shell_executable to /bin/sh 30575 1726867663.67158: Set connection var ansible_timeout to 10 30575 1726867663.67163: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867663.67176: Set connection var ansible_connection to ssh 30575 1726867663.67191: variable 'ansible_shell_executable' from source: unknown 30575 1726867663.67195: variable 'ansible_connection' from source: unknown 30575 1726867663.67197: variable 'ansible_module_compression' from source: unknown 30575 1726867663.67200: variable 'ansible_shell_type' from source: unknown 30575 1726867663.67203: variable 'ansible_shell_executable' from source: unknown 30575 1726867663.67205: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867663.67207: variable 'ansible_pipelining' from source: unknown 30575 1726867663.67210: variable 'ansible_timeout' from source: unknown 30575 1726867663.67214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867663.67361: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867663.67370: variable 'omit' from source: magic vars 30575 1726867663.67375: starting attempt loop 30575 1726867663.67382: running the handler 30575 1726867663.67393: _low_level_execute_command(): starting 30575 1726867663.67402: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 
1726867663.67884: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867663.67888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867663.67891: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867663.67894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.67940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867663.67948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867663.68001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867663.69680: stdout chunk (state=3): >>>/root <<< 30575 1726867663.69781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867663.69804: stderr chunk (state=3): >>><<< 30575 1726867663.69809: stdout chunk (state=3): >>><<< 30575 1726867663.69828: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867663.69839: _low_level_execute_command(): starting 30575 1726867663.69845: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867663.6982768-35258-130919295613700 `" && echo ansible-tmp-1726867663.6982768-35258-130919295613700="` echo /root/.ansible/tmp/ansible-tmp-1726867663.6982768-35258-130919295613700 `" ) && sleep 0' 30575 1726867663.70236: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867663.70268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867663.70272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867663.70285: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867663.70287: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867663.70290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.70334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867663.70337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867663.70341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867663.70388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867663.72257: stdout chunk (state=3): >>>ansible-tmp-1726867663.6982768-35258-130919295613700=/root/.ansible/tmp/ansible-tmp-1726867663.6982768-35258-130919295613700 <<< 30575 1726867663.72369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867663.72395: stderr chunk (state=3): >>><<< 30575 1726867663.72400: stdout chunk (state=3): >>><<< 30575 1726867663.72412: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867663.6982768-35258-130919295613700=/root/.ansible/tmp/ansible-tmp-1726867663.6982768-35258-130919295613700 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867663.72450: variable 'ansible_module_compression' from source: unknown 30575 1726867663.72493: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30575 1726867663.72528: variable 'ansible_facts' from source: unknown 30575 1726867663.72580: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867663.6982768-35258-130919295613700/AnsiballZ_stat.py 30575 1726867663.72675: Sending initial data 30575 1726867663.72680: Sent initial data (153 bytes) 30575 1726867663.73074: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867663.73115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867663.73118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867663.73120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.73123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867663.73124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867663.73126: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.73172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867663.73180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867663.73224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867663.74755: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867663.74762: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" 
revision 1 <<< 30575 1726867663.74801: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867663.74846: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpnzoozk8m /root/.ansible/tmp/ansible-tmp-1726867663.6982768-35258-130919295613700/AnsiballZ_stat.py <<< 30575 1726867663.74851: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867663.6982768-35258-130919295613700/AnsiballZ_stat.py" <<< 30575 1726867663.74893: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpnzoozk8m" to remote "/root/.ansible/tmp/ansible-tmp-1726867663.6982768-35258-130919295613700/AnsiballZ_stat.py" <<< 30575 1726867663.74899: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867663.6982768-35258-130919295613700/AnsiballZ_stat.py" <<< 30575 1726867663.75423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867663.75458: stderr chunk (state=3): >>><<< 30575 1726867663.75461: stdout chunk (state=3): >>><<< 30575 1726867663.75478: done transferring module to remote 30575 1726867663.75484: _low_level_execute_command(): starting 30575 1726867663.75488: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867663.6982768-35258-130919295613700/ /root/.ansible/tmp/ansible-tmp-1726867663.6982768-35258-130919295613700/AnsiballZ_stat.py && sleep 0' 30575 1726867663.75894: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867663.75898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.75903: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867663.75906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.75952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867663.75955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867663.76007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867663.77725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867663.77747: stderr chunk (state=3): >>><<< 30575 1726867663.77751: stdout chunk (state=3): >>><<< 30575 1726867663.77764: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867663.77767: _low_level_execute_command(): starting 30575 1726867663.77769: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867663.6982768-35258-130919295613700/AnsiballZ_stat.py && sleep 0' 30575 1726867663.78161: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867663.78170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.78191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.78237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' 
debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867663.78249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867663.78305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867663.93533: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30575 1726867663.94782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867663.94811: stderr chunk (state=3): >>><<< 30575 1726867663.94814: stdout chunk (state=3): >>><<< 30575 1726867663.94832: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867663.94859: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867663.6982768-35258-130919295613700/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867663.94868: _low_level_execute_command(): starting 30575 1726867663.94872: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867663.6982768-35258-130919295613700/ > /dev/null 2>&1 && sleep 0' 30575 1726867663.95331: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867663.95334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867663.95336: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.95339: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867663.95347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867663.95397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867663.95404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867663.95406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867663.95447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867663.97270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867663.97294: stderr chunk (state=3): >>><<< 30575 1726867663.97297: stdout chunk (state=3): >>><<< 30575 1726867663.97310: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867663.97316: handler run complete 30575 1726867663.97353: attempt loop complete, returning result 30575 1726867663.97356: _execute() done 30575 1726867663.97359: dumping result to json 30575 1726867663.97361: done dumping result, returning 30575 1726867663.97483: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcac9-a3a5-e081-a588-000000001f18] 30575 1726867663.97486: sending task result for task 0affcac9-a3a5-e081-a588-000000001f18 30575 1726867663.97551: done sending task result for task 0affcac9-a3a5-e081-a588-000000001f18 30575 1726867663.97553: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30575 1726867663.97632: no more pending results, returning what we have 30575 1726867663.97635: results queue empty 30575 1726867663.97636: checking for any_errors_fatal 30575 1726867663.97643: done checking for any_errors_fatal 30575 1726867663.97643: checking for max_fail_percentage 30575 1726867663.97645: done checking for max_fail_percentage 30575 1726867663.97646: checking to see if all hosts have failed and the running result is not ok 30575 1726867663.97647: done checking to see if all hosts have failed 30575 1726867663.97647: getting the remaining hosts for this loop 30575 1726867663.97649: done getting the remaining hosts for this loop 30575 1726867663.97652: getting the next task for host managed_node3 30575 1726867663.97659: done 
getting next task for host managed_node3 30575 1726867663.97662: ^ task is: TASK: Set NM profile exist flag based on the profile files 30575 1726867663.97667: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867663.97670: getting variables 30575 1726867663.97672: in VariableManager get_vars() 30575 1726867663.97712: Calling all_inventory to load vars for managed_node3 30575 1726867663.97715: Calling groups_inventory to load vars for managed_node3 30575 1726867663.97718: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867663.97728: Calling all_plugins_play to load vars for managed_node3 30575 1726867663.97731: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867663.97733: Calling groups_plugins_play to load vars for managed_node3 30575 1726867663.99051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867664.00022: done with get_vars() 30575 1726867664.00037: done getting variables 30575 1726867664.00082: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:27:44 -0400 (0:00:00.346) 0:01:39.378 ****** 30575 1726867664.00108: entering _queue_task() for managed_node3/set_fact 30575 1726867664.00348: worker is 1 (out of 1 available) 30575 1726867664.00363: exiting _queue_task() for managed_node3/set_fact 30575 1726867664.00379: done queuing things up, now waiting for results queue to drain 30575 1726867664.00381: waiting for pending results... 
30575 1726867664.00561: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 30575 1726867664.00882: in run() - task 0affcac9-a3a5-e081-a588-000000001f19 30575 1726867664.00887: variable 'ansible_search_path' from source: unknown 30575 1726867664.00890: variable 'ansible_search_path' from source: unknown 30575 1726867664.00892: calling self._execute() 30575 1726867664.00894: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867664.00897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867664.00900: variable 'omit' from source: magic vars 30575 1726867664.01228: variable 'ansible_distribution_major_version' from source: facts 30575 1726867664.01244: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867664.01372: variable 'profile_stat' from source: set_fact 30575 1726867664.01389: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867664.01396: when evaluation is False, skipping this task 30575 1726867664.01401: _execute() done 30575 1726867664.01407: dumping result to json 30575 1726867664.01413: done dumping result, returning 30575 1726867664.01424: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-e081-a588-000000001f19] 30575 1726867664.01432: sending task result for task 0affcac9-a3a5-e081-a588-000000001f19 30575 1726867664.01524: done sending task result for task 0affcac9-a3a5-e081-a588-000000001f19 30575 1726867664.01530: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867664.01695: no more pending results, returning what we have 30575 1726867664.01699: results queue empty 30575 1726867664.01699: checking for any_errors_fatal 30575 1726867664.01707: done checking for any_errors_fatal 30575 1726867664.01707: 
checking for max_fail_percentage 30575 1726867664.01709: done checking for max_fail_percentage 30575 1726867664.01710: checking to see if all hosts have failed and the running result is not ok 30575 1726867664.01711: done checking to see if all hosts have failed 30575 1726867664.01711: getting the remaining hosts for this loop 30575 1726867664.01713: done getting the remaining hosts for this loop 30575 1726867664.01716: getting the next task for host managed_node3 30575 1726867664.01722: done getting next task for host managed_node3 30575 1726867664.01724: ^ task is: TASK: Get NM profile info 30575 1726867664.01729: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867664.01732: getting variables 30575 1726867664.01734: in VariableManager get_vars() 30575 1726867664.01772: Calling all_inventory to load vars for managed_node3 30575 1726867664.01775: Calling groups_inventory to load vars for managed_node3 30575 1726867664.01779: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867664.01788: Calling all_plugins_play to load vars for managed_node3 30575 1726867664.01790: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867664.01792: Calling groups_plugins_play to load vars for managed_node3 30575 1726867664.03204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867664.04837: done with get_vars() 30575 1726867664.04857: done getting variables 30575 1726867664.04917: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:27:44 -0400 (0:00:00.048) 0:01:39.427 ****** 30575 1726867664.04951: entering _queue_task() for managed_node3/shell 30575 1726867664.05228: worker is 1 (out of 1 available) 30575 1726867664.05241: exiting _queue_task() for managed_node3/shell 30575 1726867664.05255: done queuing things up, now waiting for results queue to drain 30575 1726867664.05257: waiting for pending results... 
30575 1726867664.05552: running TaskExecutor() for managed_node3/TASK: Get NM profile info 30575 1726867664.05709: in run() - task 0affcac9-a3a5-e081-a588-000000001f1a 30575 1726867664.05735: variable 'ansible_search_path' from source: unknown 30575 1726867664.05743: variable 'ansible_search_path' from source: unknown 30575 1726867664.05788: calling self._execute() 30575 1726867664.05896: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867664.05909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867664.05931: variable 'omit' from source: magic vars 30575 1726867664.06333: variable 'ansible_distribution_major_version' from source: facts 30575 1726867664.06357: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867664.06369: variable 'omit' from source: magic vars 30575 1726867664.06431: variable 'omit' from source: magic vars 30575 1726867664.06544: variable 'profile' from source: play vars 30575 1726867664.06556: variable 'interface' from source: play vars 30575 1726867664.06627: variable 'interface' from source: play vars 30575 1726867664.06652: variable 'omit' from source: magic vars 30575 1726867664.06699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867664.06744: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867664.06770: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867664.06794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867664.06845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867664.06852: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 
1726867664.06859: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867664.06866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867664.06964: Set connection var ansible_pipelining to False 30575 1726867664.06972: Set connection var ansible_shell_type to sh 30575 1726867664.06984: Set connection var ansible_shell_executable to /bin/sh 30575 1726867664.06995: Set connection var ansible_timeout to 10 30575 1726867664.07062: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867664.07065: Set connection var ansible_connection to ssh 30575 1726867664.07067: variable 'ansible_shell_executable' from source: unknown 30575 1726867664.07069: variable 'ansible_connection' from source: unknown 30575 1726867664.07071: variable 'ansible_module_compression' from source: unknown 30575 1726867664.07073: variable 'ansible_shell_type' from source: unknown 30575 1726867664.07075: variable 'ansible_shell_executable' from source: unknown 30575 1726867664.07079: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867664.07081: variable 'ansible_pipelining' from source: unknown 30575 1726867664.07083: variable 'ansible_timeout' from source: unknown 30575 1726867664.07085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867664.07239: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867664.07255: variable 'omit' from source: magic vars 30575 1726867664.07266: starting attempt loop 30575 1726867664.07272: running the handler 30575 1726867664.07293: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867664.07318: _low_level_execute_command(): starting 30575 1726867664.07462: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867664.08070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867664.08130: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867664.08197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867664.08213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867664.08244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867664.08325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867664.09960: stdout chunk (state=3): >>>/root <<< 30575 1726867664.10099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867664.10136: stdout chunk (state=3): 
>>><<< 30575 1726867664.10150: stderr chunk (state=3): >>><<< 30575 1726867664.10179: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867664.10218: _low_level_execute_command(): starting 30575 1726867664.10223: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867664.1018822-35269-176482584631537 `" && echo ansible-tmp-1726867664.1018822-35269-176482584631537="` echo /root/.ansible/tmp/ansible-tmp-1726867664.1018822-35269-176482584631537 `" ) && sleep 0' 30575 1726867664.11158: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867664.11169: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867664.11172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867664.11174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867664.11178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867664.11231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867664.11234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867664.11265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867664.11342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867664.13222: stdout chunk (state=3): >>>ansible-tmp-1726867664.1018822-35269-176482584631537=/root/.ansible/tmp/ansible-tmp-1726867664.1018822-35269-176482584631537 <<< 30575 1726867664.13387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867664.13391: stdout chunk (state=3): >>><<< 30575 1726867664.13393: stderr chunk (state=3): >>><<< 30575 1726867664.13583: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867664.1018822-35269-176482584631537=/root/.ansible/tmp/ansible-tmp-1726867664.1018822-35269-176482584631537 , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867664.13587: variable 'ansible_module_compression' from source: unknown 30575 1726867664.13589: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867664.13591: variable 'ansible_facts' from source: unknown 30575 1726867664.13668: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867664.1018822-35269-176482584631537/AnsiballZ_command.py 30575 1726867664.13907: Sending initial data 30575 1726867664.13920: Sent initial data (156 bytes) 30575 1726867664.14519: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867664.14535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867664.14552: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30575 1726867664.14587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867664.14687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867664.14704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867664.14786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867664.16311: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867664.16376: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867664.16419: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp4ur5hdfc /root/.ansible/tmp/ansible-tmp-1726867664.1018822-35269-176482584631537/AnsiballZ_command.py <<< 30575 1726867664.16440: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867664.1018822-35269-176482584631537/AnsiballZ_command.py" <<< 30575 1726867664.16488: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp4ur5hdfc" to remote "/root/.ansible/tmp/ansible-tmp-1726867664.1018822-35269-176482584631537/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867664.1018822-35269-176482584631537/AnsiballZ_command.py" <<< 30575 1726867664.17282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867664.17285: stdout chunk (state=3): >>><<< 30575 1726867664.17353: stderr chunk (state=3): >>><<< 30575 1726867664.17363: done transferring module to remote 30575 1726867664.17376: _low_level_execute_command(): starting 30575 1726867664.17391: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867664.1018822-35269-176482584631537/ /root/.ansible/tmp/ansible-tmp-1726867664.1018822-35269-176482584631537/AnsiballZ_command.py && sleep 0' 30575 1726867664.18037: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867664.18051: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867664.18064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867664.18084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
30575 1726867664.18126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867664.18142: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30575 1726867664.18237: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867664.18251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867664.18271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867664.18347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867664.20166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867664.20182: stdout chunk (state=3): >>><<< 30575 1726867664.20198: stderr chunk (state=3): >>><<< 30575 1726867664.20225: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867664.20309: _low_level_execute_command(): starting 30575 1726867664.20313: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867664.1018822-35269-176482584631537/AnsiballZ_command.py && sleep 0' 30575 1726867664.20826: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867664.20832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867664.20865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867664.20868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867664.20870: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867664.20872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867664.20886: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867664.20934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867664.20937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867664.20939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867664.20999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867664.37902: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 17:27:44.360301", "end": "2024-09-20 17:27:44.376797", "delta": "0:00:00.016496", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867664.39530: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867664.39534: stdout chunk (state=3): >>><<< 30575 1726867664.39536: stderr chunk (state=3): >>><<< 30575 1726867664.39540: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 17:27:44.360301", "end": "2024-09-20 17:27:44.376797", "delta": "0:00:00.016496", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.68 
closed. 30575 1726867664.39543: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867664.1018822-35269-176482584631537/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867664.39545: _low_level_execute_command(): starting 30575 1726867664.39547: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867664.1018822-35269-176482584631537/ > /dev/null 2>&1 && sleep 0' 30575 1726867664.40219: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867664.40298: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867664.40339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867664.40361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867664.40438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867664.42314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867664.42318: stdout chunk (state=3): >>><<< 30575 1726867664.42328: stderr chunk (state=3): >>><<< 30575 1726867664.42344: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 
1726867664.42583: handler run complete 30575 1726867664.42586: Evaluated conditional (False): False 30575 1726867664.42588: attempt loop complete, returning result 30575 1726867664.42590: _execute() done 30575 1726867664.42593: dumping result to json 30575 1726867664.42595: done dumping result, returning 30575 1726867664.42598: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcac9-a3a5-e081-a588-000000001f1a] 30575 1726867664.42600: sending task result for task 0affcac9-a3a5-e081-a588-000000001f1a 30575 1726867664.42667: done sending task result for task 0affcac9-a3a5-e081-a588-000000001f1a 30575 1726867664.42670: WORKER PROCESS EXITING fatal: [managed_node3]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.016496", "end": "2024-09-20 17:27:44.376797", "rc": 1, "start": "2024-09-20 17:27:44.360301" } MSG: non-zero return code ...ignoring 30575 1726867664.42756: no more pending results, returning what we have 30575 1726867664.42760: results queue empty 30575 1726867664.42761: checking for any_errors_fatal 30575 1726867664.42769: done checking for any_errors_fatal 30575 1726867664.42770: checking for max_fail_percentage 30575 1726867664.42772: done checking for max_fail_percentage 30575 1726867664.42773: checking to see if all hosts have failed and the running result is not ok 30575 1726867664.42774: done checking to see if all hosts have failed 30575 1726867664.42774: getting the remaining hosts for this loop 30575 1726867664.42776: done getting the remaining hosts for this loop 30575 1726867664.42781: getting the next task for host managed_node3 30575 1726867664.42789: done getting next task for host managed_node3 30575 1726867664.42792: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30575 1726867664.42797: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867664.42801: getting variables 30575 1726867664.42802: in VariableManager get_vars() 30575 1726867664.42851: Calling all_inventory to load vars for managed_node3 30575 1726867664.42854: Calling groups_inventory to load vars for managed_node3 30575 1726867664.42858: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867664.42869: Calling all_plugins_play to load vars for managed_node3 30575 1726867664.42873: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867664.42876: Calling groups_plugins_play to load vars for managed_node3 30575 1726867664.44032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867664.44975: done with get_vars() 30575 1726867664.45001: done getting variables 30575 1726867664.45066: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:27:44 -0400 (0:00:00.401) 0:01:39.828 ****** 30575 1726867664.45099: entering _queue_task() for managed_node3/set_fact 30575 1726867664.45455: worker is 1 (out of 1 available) 30575 1726867664.45470: exiting _queue_task() for managed_node3/set_fact 30575 1726867664.45685: done queuing things up, now waiting for results queue to drain 30575 1726867664.45687: waiting for pending results... 
30575 1726867664.45790: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30575 1726867664.45922: in run() - task 0affcac9-a3a5-e081-a588-000000001f1b 30575 1726867664.45957: variable 'ansible_search_path' from source: unknown 30575 1726867664.45968: variable 'ansible_search_path' from source: unknown 30575 1726867664.46002: calling self._execute() 30575 1726867664.46185: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867664.46190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867664.46193: variable 'omit' from source: magic vars 30575 1726867664.46584: variable 'ansible_distribution_major_version' from source: facts 30575 1726867664.46621: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867664.46762: variable 'nm_profile_exists' from source: set_fact 30575 1726867664.46783: Evaluated conditional (nm_profile_exists.rc == 0): False 30575 1726867664.46827: when evaluation is False, skipping this task 30575 1726867664.46831: _execute() done 30575 1726867664.46840: dumping result to json 30575 1726867664.46857: done dumping result, returning 30575 1726867664.46873: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-e081-a588-000000001f1b] 30575 1726867664.46876: sending task result for task 0affcac9-a3a5-e081-a588-000000001f1b 30575 1726867664.47035: done sending task result for task 0affcac9-a3a5-e081-a588-000000001f1b 30575 1726867664.47038: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 30575 1726867664.47089: no more pending results, returning what we have 30575 1726867664.47114: results queue empty 30575 1726867664.47115: checking for any_errors_fatal 30575 
1726867664.47128: done checking for any_errors_fatal 30575 1726867664.47129: checking for max_fail_percentage 30575 1726867664.47130: done checking for max_fail_percentage 30575 1726867664.47131: checking to see if all hosts have failed and the running result is not ok 30575 1726867664.47132: done checking to see if all hosts have failed 30575 1726867664.47133: getting the remaining hosts for this loop 30575 1726867664.47134: done getting the remaining hosts for this loop 30575 1726867664.47138: getting the next task for host managed_node3 30575 1726867664.47155: done getting next task for host managed_node3 30575 1726867664.47158: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30575 1726867664.47163: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867664.47167: getting variables 30575 1726867664.47169: in VariableManager get_vars() 30575 1726867664.47271: Calling all_inventory to load vars for managed_node3 30575 1726867664.47274: Calling groups_inventory to load vars for managed_node3 30575 1726867664.47279: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867664.47289: Calling all_plugins_play to load vars for managed_node3 30575 1726867664.47292: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867664.47295: Calling groups_plugins_play to load vars for managed_node3 30575 1726867664.48109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867664.48970: done with get_vars() 30575 1726867664.48999: done getting variables 30575 1726867664.49057: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867664.49180: variable 'profile' from source: play vars 30575 1726867664.49185: variable 'interface' from source: play vars 30575 1726867664.49245: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:27:44 -0400 (0:00:00.041) 0:01:39.870 ****** 30575 1726867664.49282: entering _queue_task() for managed_node3/command 30575 1726867664.49596: worker is 1 (out of 1 available) 30575 1726867664.49608: exiting _queue_task() for managed_node3/command 30575 1726867664.49621: done queuing things up, now waiting for results queue to drain 30575 1726867664.49623: waiting for pending results... 
30575 1726867664.50009: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr 30575 1726867664.50069: in run() - task 0affcac9-a3a5-e081-a588-000000001f1d 30575 1726867664.50091: variable 'ansible_search_path' from source: unknown 30575 1726867664.50096: variable 'ansible_search_path' from source: unknown 30575 1726867664.50122: calling self._execute() 30575 1726867664.50193: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867664.50197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867664.50205: variable 'omit' from source: magic vars 30575 1726867664.50496: variable 'ansible_distribution_major_version' from source: facts 30575 1726867664.50505: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867664.50593: variable 'profile_stat' from source: set_fact 30575 1726867664.50597: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867664.50599: when evaluation is False, skipping this task 30575 1726867664.50604: _execute() done 30575 1726867664.50607: dumping result to json 30575 1726867664.50612: done dumping result, returning 30575 1726867664.50621: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-000000001f1d] 30575 1726867664.50626: sending task result for task 0affcac9-a3a5-e081-a588-000000001f1d 30575 1726867664.50710: done sending task result for task 0affcac9-a3a5-e081-a588-000000001f1d 30575 1726867664.50713: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867664.50802: no more pending results, returning what we have 30575 1726867664.50805: results queue empty 30575 1726867664.50806: checking for any_errors_fatal 30575 1726867664.50812: done checking for any_errors_fatal 30575 1726867664.50812: 
checking for max_fail_percentage 30575 1726867664.50814: done checking for max_fail_percentage 30575 1726867664.50814: checking to see if all hosts have failed and the running result is not ok 30575 1726867664.50815: done checking to see if all hosts have failed 30575 1726867664.50816: getting the remaining hosts for this loop 30575 1726867664.50817: done getting the remaining hosts for this loop 30575 1726867664.50822: getting the next task for host managed_node3 30575 1726867664.50829: done getting next task for host managed_node3 30575 1726867664.50832: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30575 1726867664.50836: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867664.50839: getting variables 30575 1726867664.50840: in VariableManager get_vars() 30575 1726867664.50874: Calling all_inventory to load vars for managed_node3 30575 1726867664.50878: Calling groups_inventory to load vars for managed_node3 30575 1726867664.50881: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867664.50889: Calling all_plugins_play to load vars for managed_node3 30575 1726867664.50892: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867664.50894: Calling groups_plugins_play to load vars for managed_node3 30575 1726867664.51767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867664.52620: done with get_vars() 30575 1726867664.52635: done getting variables 30575 1726867664.52678: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867664.52749: variable 'profile' from source: play vars 30575 1726867664.52752: variable 'interface' from source: play vars 30575 1726867664.52794: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:27:44 -0400 (0:00:00.035) 0:01:39.905 ****** 30575 1726867664.52817: entering _queue_task() for managed_node3/set_fact 30575 1726867664.53037: worker is 1 (out of 1 available) 30575 1726867664.53051: exiting _queue_task() for managed_node3/set_fact 30575 1726867664.53065: done queuing things up, now waiting for results queue to drain 30575 1726867664.53067: waiting for pending results... 
30575 1726867664.53249: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr 30575 1726867664.53342: in run() - task 0affcac9-a3a5-e081-a588-000000001f1e 30575 1726867664.53353: variable 'ansible_search_path' from source: unknown 30575 1726867664.53357: variable 'ansible_search_path' from source: unknown 30575 1726867664.53389: calling self._execute() 30575 1726867664.53458: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867664.53461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867664.53471: variable 'omit' from source: magic vars 30575 1726867664.53741: variable 'ansible_distribution_major_version' from source: facts 30575 1726867664.53752: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867664.53837: variable 'profile_stat' from source: set_fact 30575 1726867664.53846: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867664.53849: when evaluation is False, skipping this task 30575 1726867664.53854: _execute() done 30575 1726867664.53856: dumping result to json 30575 1726867664.53859: done dumping result, returning 30575 1726867664.53868: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-000000001f1e] 30575 1726867664.53871: sending task result for task 0affcac9-a3a5-e081-a588-000000001f1e 30575 1726867664.53954: done sending task result for task 0affcac9-a3a5-e081-a588-000000001f1e 30575 1726867664.53957: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867664.54004: no more pending results, returning what we have 30575 1726867664.54008: results queue empty 30575 1726867664.54009: checking for any_errors_fatal 30575 1726867664.54016: done checking for any_errors_fatal 30575 1726867664.54017: 
checking for max_fail_percentage 30575 1726867664.54018: done checking for max_fail_percentage 30575 1726867664.54019: checking to see if all hosts have failed and the running result is not ok 30575 1726867664.54020: done checking to see if all hosts have failed 30575 1726867664.54021: getting the remaining hosts for this loop 30575 1726867664.54023: done getting the remaining hosts for this loop 30575 1726867664.54026: getting the next task for host managed_node3 30575 1726867664.54034: done getting next task for host managed_node3 30575 1726867664.54036: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30575 1726867664.54041: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867664.54044: getting variables 30575 1726867664.54045: in VariableManager get_vars() 30575 1726867664.54084: Calling all_inventory to load vars for managed_node3 30575 1726867664.54087: Calling groups_inventory to load vars for managed_node3 30575 1726867664.54090: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867664.54100: Calling all_plugins_play to load vars for managed_node3 30575 1726867664.54102: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867664.54105: Calling groups_plugins_play to load vars for managed_node3 30575 1726867664.54870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867664.55838: done with get_vars() 30575 1726867664.55855: done getting variables 30575 1726867664.55903: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867664.55983: variable 'profile' from source: play vars 30575 1726867664.55986: variable 'interface' from source: play vars 30575 1726867664.56030: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:27:44 -0400 (0:00:00.032) 0:01:39.938 ****** 30575 1726867664.56054: entering _queue_task() for managed_node3/command 30575 1726867664.56335: worker is 1 (out of 1 available) 30575 1726867664.56348: exiting _queue_task() for managed_node3/command 30575 1726867664.56365: done queuing things up, now waiting for results queue to drain 30575 1726867664.56366: waiting for pending results... 
30575 1726867664.56562: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr 30575 1726867664.56649: in run() - task 0affcac9-a3a5-e081-a588-000000001f1f 30575 1726867664.56660: variable 'ansible_search_path' from source: unknown 30575 1726867664.56664: variable 'ansible_search_path' from source: unknown 30575 1726867664.56693: calling self._execute() 30575 1726867664.56775: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867664.56781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867664.56791: variable 'omit' from source: magic vars 30575 1726867664.57075: variable 'ansible_distribution_major_version' from source: facts 30575 1726867664.57088: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867664.57180: variable 'profile_stat' from source: set_fact 30575 1726867664.57191: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867664.57194: when evaluation is False, skipping this task 30575 1726867664.57197: _execute() done 30575 1726867664.57199: dumping result to json 30575 1726867664.57202: done dumping result, returning 30575 1726867664.57211: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-000000001f1f] 30575 1726867664.57217: sending task result for task 0affcac9-a3a5-e081-a588-000000001f1f 30575 1726867664.57300: done sending task result for task 0affcac9-a3a5-e081-a588-000000001f1f 30575 1726867664.57303: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867664.57354: no more pending results, returning what we have 30575 1726867664.57358: results queue empty 30575 1726867664.57359: checking for any_errors_fatal 30575 1726867664.57368: done checking for any_errors_fatal 30575 1726867664.57369: checking for 
max_fail_percentage 30575 1726867664.57371: done checking for max_fail_percentage 30575 1726867664.57372: checking to see if all hosts have failed and the running result is not ok 30575 1726867664.57373: done checking to see if all hosts have failed 30575 1726867664.57373: getting the remaining hosts for this loop 30575 1726867664.57375: done getting the remaining hosts for this loop 30575 1726867664.57381: getting the next task for host managed_node3 30575 1726867664.57389: done getting next task for host managed_node3 30575 1726867664.57392: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30575 1726867664.57397: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867664.57401: getting variables 30575 1726867664.57403: in VariableManager get_vars() 30575 1726867664.57449: Calling all_inventory to load vars for managed_node3 30575 1726867664.57452: Calling groups_inventory to load vars for managed_node3 30575 1726867664.57455: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867664.57466: Calling all_plugins_play to load vars for managed_node3 30575 1726867664.57469: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867664.57471: Calling groups_plugins_play to load vars for managed_node3 30575 1726867664.58292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867664.59148: done with get_vars() 30575 1726867664.59165: done getting variables 30575 1726867664.59210: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867664.59294: variable 'profile' from source: play vars 30575 1726867664.59297: variable 'interface' from source: play vars 30575 1726867664.59339: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:27:44 -0400 (0:00:00.033) 0:01:39.971 ****** 30575 1726867664.59363: entering _queue_task() for managed_node3/set_fact 30575 1726867664.59629: worker is 1 (out of 1 available) 30575 1726867664.59644: exiting _queue_task() for managed_node3/set_fact 30575 1726867664.59658: done queuing things up, now waiting for results queue to drain 30575 1726867664.59660: waiting for pending results... 
30575 1726867664.59855: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr 30575 1726867664.59946: in run() - task 0affcac9-a3a5-e081-a588-000000001f20 30575 1726867664.59958: variable 'ansible_search_path' from source: unknown 30575 1726867664.59961: variable 'ansible_search_path' from source: unknown 30575 1726867664.59993: calling self._execute() 30575 1726867664.60071: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867664.60074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867664.60085: variable 'omit' from source: magic vars 30575 1726867664.60364: variable 'ansible_distribution_major_version' from source: facts 30575 1726867664.60374: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867664.60465: variable 'profile_stat' from source: set_fact 30575 1726867664.60474: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867664.60479: when evaluation is False, skipping this task 30575 1726867664.60482: _execute() done 30575 1726867664.60485: dumping result to json 30575 1726867664.60536: done dumping result, returning 30575 1726867664.60541: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-000000001f20] 30575 1726867664.60543: sending task result for task 0affcac9-a3a5-e081-a588-000000001f20 30575 1726867664.60611: done sending task result for task 0affcac9-a3a5-e081-a588-000000001f20 30575 1726867664.60614: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867664.60686: no more pending results, returning what we have 30575 1726867664.60696: results queue empty 30575 1726867664.60697: checking for any_errors_fatal 30575 1726867664.60701: done checking for any_errors_fatal 30575 1726867664.60702: checking 
for max_fail_percentage 30575 1726867664.60703: done checking for max_fail_percentage 30575 1726867664.60704: checking to see if all hosts have failed and the running result is not ok 30575 1726867664.60705: done checking to see if all hosts have failed 30575 1726867664.60706: getting the remaining hosts for this loop 30575 1726867664.60707: done getting the remaining hosts for this loop 30575 1726867664.60711: getting the next task for host managed_node3 30575 1726867664.60719: done getting next task for host managed_node3 30575 1726867664.60722: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 30575 1726867664.60726: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867664.60729: getting variables 30575 1726867664.60731: in VariableManager get_vars() 30575 1726867664.60767: Calling all_inventory to load vars for managed_node3 30575 1726867664.60769: Calling groups_inventory to load vars for managed_node3 30575 1726867664.60772: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867664.60784: Calling all_plugins_play to load vars for managed_node3 30575 1726867664.60787: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867664.60789: Calling groups_plugins_play to load vars for managed_node3 30575 1726867664.61709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867664.66795: done with get_vars() 30575 1726867664.66817: done getting variables 30575 1726867664.66852: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867664.66923: variable 'profile' from source: play vars 30575 1726867664.66926: variable 'interface' from source: play vars 30575 1726867664.66967: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 17:27:44 -0400 (0:00:00.076) 0:01:40.047 ****** 30575 1726867664.66990: entering _queue_task() for managed_node3/assert 30575 1726867664.67278: worker is 1 (out of 1 available) 30575 1726867664.67295: exiting _queue_task() for managed_node3/assert 30575 1726867664.67310: done queuing things up, now waiting for results queue to drain 30575 1726867664.67312: waiting for pending results... 
30575 1726867664.67497: running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'statebr' 30575 1726867664.67597: in run() - task 0affcac9-a3a5-e081-a588-000000001e9a 30575 1726867664.67610: variable 'ansible_search_path' from source: unknown 30575 1726867664.67615: variable 'ansible_search_path' from source: unknown 30575 1726867664.67650: calling self._execute() 30575 1726867664.67727: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867664.67735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867664.67745: variable 'omit' from source: magic vars 30575 1726867664.68035: variable 'ansible_distribution_major_version' from source: facts 30575 1726867664.68044: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867664.68050: variable 'omit' from source: magic vars 30575 1726867664.68083: variable 'omit' from source: magic vars 30575 1726867664.68159: variable 'profile' from source: play vars 30575 1726867664.68162: variable 'interface' from source: play vars 30575 1726867664.68215: variable 'interface' from source: play vars 30575 1726867664.68232: variable 'omit' from source: magic vars 30575 1726867664.68264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867664.68294: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867664.68381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867664.68384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867664.68386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867664.68387: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30575 1726867664.68390: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867664.68391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867664.68448: Set connection var ansible_pipelining to False 30575 1726867664.68451: Set connection var ansible_shell_type to sh 30575 1726867664.68456: Set connection var ansible_shell_executable to /bin/sh 30575 1726867664.68462: Set connection var ansible_timeout to 10 30575 1726867664.68467: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867664.68474: Set connection var ansible_connection to ssh 30575 1726867664.68493: variable 'ansible_shell_executable' from source: unknown 30575 1726867664.68497: variable 'ansible_connection' from source: unknown 30575 1726867664.68499: variable 'ansible_module_compression' from source: unknown 30575 1726867664.68502: variable 'ansible_shell_type' from source: unknown 30575 1726867664.68505: variable 'ansible_shell_executable' from source: unknown 30575 1726867664.68507: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867664.68509: variable 'ansible_pipelining' from source: unknown 30575 1726867664.68513: variable 'ansible_timeout' from source: unknown 30575 1726867664.68516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867664.68618: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867664.68636: variable 'omit' from source: magic vars 30575 1726867664.68639: starting attempt loop 30575 1726867664.68642: running the handler 30575 1726867664.68721: variable 'lsr_net_profile_exists' from source: set_fact 30575 1726867664.68725: Evaluated conditional (not 
lsr_net_profile_exists): True 30575 1726867664.68732: handler run complete 30575 1726867664.68745: attempt loop complete, returning result 30575 1726867664.68748: _execute() done 30575 1726867664.68751: dumping result to json 30575 1726867664.68754: done dumping result, returning 30575 1726867664.68761: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'statebr' [0affcac9-a3a5-e081-a588-000000001e9a] 30575 1726867664.68766: sending task result for task 0affcac9-a3a5-e081-a588-000000001e9a 30575 1726867664.68849: done sending task result for task 0affcac9-a3a5-e081-a588-000000001e9a 30575 1726867664.68851: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867664.68898: no more pending results, returning what we have 30575 1726867664.68901: results queue empty 30575 1726867664.68902: checking for any_errors_fatal 30575 1726867664.68912: done checking for any_errors_fatal 30575 1726867664.68912: checking for max_fail_percentage 30575 1726867664.68914: done checking for max_fail_percentage 30575 1726867664.68915: checking to see if all hosts have failed and the running result is not ok 30575 1726867664.68916: done checking to see if all hosts have failed 30575 1726867664.68917: getting the remaining hosts for this loop 30575 1726867664.68918: done getting the remaining hosts for this loop 30575 1726867664.68921: getting the next task for host managed_node3 30575 1726867664.68931: done getting next task for host managed_node3 30575 1726867664.68934: ^ task is: TASK: Conditional asserts 30575 1726867664.68936: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867664.68942: getting variables 30575 1726867664.68943: in VariableManager get_vars() 30575 1726867664.68987: Calling all_inventory to load vars for managed_node3 30575 1726867664.68989: Calling groups_inventory to load vars for managed_node3 30575 1726867664.68993: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867664.69003: Calling all_plugins_play to load vars for managed_node3 30575 1726867664.69006: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867664.69008: Calling groups_plugins_play to load vars for managed_node3 30575 1726867664.69810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867664.70680: done with get_vars() 30575 1726867664.70696: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 17:27:44 -0400 (0:00:00.037) 0:01:40.085 ****** 30575 1726867664.70762: entering _queue_task() for managed_node3/include_tasks 30575 1726867664.71005: worker is 1 (out of 1 available) 30575 1726867664.71018: exiting _queue_task() for managed_node3/include_tasks 30575 1726867664.71031: done queuing things up, now waiting for results queue to drain 30575 1726867664.71033: waiting for pending results... 
30575 1726867664.71216: running TaskExecutor() for managed_node3/TASK: Conditional asserts 30575 1726867664.71300: in run() - task 0affcac9-a3a5-e081-a588-00000000174a 30575 1726867664.71312: variable 'ansible_search_path' from source: unknown 30575 1726867664.71315: variable 'ansible_search_path' from source: unknown 30575 1726867664.71527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867664.73059: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867664.73108: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867664.73138: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867664.73162: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867664.73184: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867664.73549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867664.73571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867664.73590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867664.73616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30575 1726867664.73629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867664.73711: variable 'lsr_assert_when' from source: include params 30575 1726867664.73795: variable 'network_provider' from source: set_fact 30575 1726867664.73854: variable 'omit' from source: magic vars 30575 1726867664.73926: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867664.73933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867664.73941: variable 'omit' from source: magic vars 30575 1726867664.74080: variable 'ansible_distribution_major_version' from source: facts 30575 1726867664.74092: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867664.74164: variable 'item' from source: unknown 30575 1726867664.74168: Evaluated conditional (item['condition']): True 30575 1726867664.74229: variable 'item' from source: unknown 30575 1726867664.74255: variable 'item' from source: unknown 30575 1726867664.74310: variable 'item' from source: unknown 30575 1726867664.74443: dumping result to json 30575 1726867664.74446: done dumping result, returning 30575 1726867664.74448: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [0affcac9-a3a5-e081-a588-00000000174a] 30575 1726867664.74450: sending task result for task 0affcac9-a3a5-e081-a588-00000000174a 30575 1726867664.74488: done sending task result for task 0affcac9-a3a5-e081-a588-00000000174a 30575 1726867664.74490: WORKER PROCESS EXITING 30575 1726867664.74514: no more pending results, returning what we have 30575 1726867664.74519: in VariableManager get_vars() 30575 1726867664.74566: Calling all_inventory to load vars for managed_node3 30575 1726867664.74569: Calling groups_inventory to load vars for managed_node3 30575 1726867664.74572: 
Calling all_plugins_inventory to load vars for managed_node3 30575 1726867664.74584: Calling all_plugins_play to load vars for managed_node3 30575 1726867664.74587: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867664.74589: Calling groups_plugins_play to load vars for managed_node3 30575 1726867664.75535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867664.76430: done with get_vars() 30575 1726867664.76443: variable 'ansible_search_path' from source: unknown 30575 1726867664.76444: variable 'ansible_search_path' from source: unknown 30575 1726867664.76470: we have included files to process 30575 1726867664.76471: generating all_blocks data 30575 1726867664.76472: done generating all_blocks data 30575 1726867664.76476: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30575 1726867664.76476: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30575 1726867664.76480: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30575 1726867664.76549: in VariableManager get_vars() 30575 1726867664.76563: done with get_vars() 30575 1726867664.76637: done processing included file 30575 1726867664.76639: iterating over new_blocks loaded from include file 30575 1726867664.76640: in VariableManager get_vars() 30575 1726867664.76653: done with get_vars() 30575 1726867664.76654: filtering new block on tags 30575 1726867664.76675: done filtering new block on tags 30575 1726867664.76679: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node3 => (item={'what': 
'tasks/assert_device_absent.yml', 'condition': True}) 30575 1726867664.76682: extending task lists for all hosts with included blocks 30575 1726867664.77340: done extending task lists 30575 1726867664.77341: done processing included files 30575 1726867664.77342: results queue empty 30575 1726867664.77342: checking for any_errors_fatal 30575 1726867664.77344: done checking for any_errors_fatal 30575 1726867664.77345: checking for max_fail_percentage 30575 1726867664.77346: done checking for max_fail_percentage 30575 1726867664.77346: checking to see if all hosts have failed and the running result is not ok 30575 1726867664.77347: done checking to see if all hosts have failed 30575 1726867664.77347: getting the remaining hosts for this loop 30575 1726867664.77348: done getting the remaining hosts for this loop 30575 1726867664.77350: getting the next task for host managed_node3 30575 1726867664.77353: done getting next task for host managed_node3 30575 1726867664.77354: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30575 1726867664.77356: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867664.77361: getting variables 30575 1726867664.77362: in VariableManager get_vars() 30575 1726867664.77370: Calling all_inventory to load vars for managed_node3 30575 1726867664.77371: Calling groups_inventory to load vars for managed_node3 30575 1726867664.77373: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867664.77381: Calling all_plugins_play to load vars for managed_node3 30575 1726867664.77383: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867664.77386: Calling groups_plugins_play to load vars for managed_node3 30575 1726867664.78031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867664.79358: done with get_vars() 30575 1726867664.79373: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 17:27:44 -0400 (0:00:00.086) 0:01:40.171 ****** 30575 1726867664.79425: entering _queue_task() for managed_node3/include_tasks 30575 1726867664.79689: worker is 1 (out of 1 available) 30575 1726867664.79704: exiting _queue_task() for managed_node3/include_tasks 30575 1726867664.79718: done queuing things up, now waiting for results queue to drain 30575 1726867664.79720: waiting for pending results... 
30575 1726867664.79919: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 30575 1726867664.80001: in run() - task 0affcac9-a3a5-e081-a588-000000001f59 30575 1726867664.80013: variable 'ansible_search_path' from source: unknown 30575 1726867664.80017: variable 'ansible_search_path' from source: unknown 30575 1726867664.80050: calling self._execute() 30575 1726867664.80128: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867664.80132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867664.80142: variable 'omit' from source: magic vars 30575 1726867664.80429: variable 'ansible_distribution_major_version' from source: facts 30575 1726867664.80438: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867664.80443: _execute() done 30575 1726867664.80452: dumping result to json 30575 1726867664.80455: done dumping result, returning 30575 1726867664.80458: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-e081-a588-000000001f59] 30575 1726867664.80464: sending task result for task 0affcac9-a3a5-e081-a588-000000001f59 30575 1726867664.80547: done sending task result for task 0affcac9-a3a5-e081-a588-000000001f59 30575 1726867664.80550: WORKER PROCESS EXITING 30575 1726867664.80576: no more pending results, returning what we have 30575 1726867664.80583: in VariableManager get_vars() 30575 1726867664.80631: Calling all_inventory to load vars for managed_node3 30575 1726867664.80633: Calling groups_inventory to load vars for managed_node3 30575 1726867664.80637: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867664.80648: Calling all_plugins_play to load vars for managed_node3 30575 1726867664.80650: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867664.80653: Calling groups_plugins_play to load vars for managed_node3 30575 
1726867664.81806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867664.83298: done with get_vars() 30575 1726867664.83317: variable 'ansible_search_path' from source: unknown 30575 1726867664.83319: variable 'ansible_search_path' from source: unknown 30575 1726867664.83455: variable 'item' from source: include params 30575 1726867664.83492: we have included files to process 30575 1726867664.83494: generating all_blocks data 30575 1726867664.83496: done generating all_blocks data 30575 1726867664.83497: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867664.83498: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867664.83500: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867664.83674: done processing included file 30575 1726867664.83676: iterating over new_blocks loaded from include file 30575 1726867664.83679: in VariableManager get_vars() 30575 1726867664.83697: done with get_vars() 30575 1726867664.83698: filtering new block on tags 30575 1726867664.83722: done filtering new block on tags 30575 1726867664.83725: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 30575 1726867664.83730: extending task lists for all hosts with included blocks 30575 1726867664.83883: done extending task lists 30575 1726867664.83885: done processing included files 30575 1726867664.83886: results queue empty 30575 1726867664.83886: checking for any_errors_fatal 30575 1726867664.83890: done checking for any_errors_fatal 30575 1726867664.83891: checking for 
max_fail_percentage 30575 1726867664.83892: done checking for max_fail_percentage 30575 1726867664.83892: checking to see if all hosts have failed and the running result is not ok 30575 1726867664.83893: done checking to see if all hosts have failed 30575 1726867664.83894: getting the remaining hosts for this loop 30575 1726867664.83895: done getting the remaining hosts for this loop 30575 1726867664.83898: getting the next task for host managed_node3 30575 1726867664.83902: done getting next task for host managed_node3 30575 1726867664.83904: ^ task is: TASK: Get stat for interface {{ interface }} 30575 1726867664.83908: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867664.83910: getting variables 30575 1726867664.83911: in VariableManager get_vars() 30575 1726867664.83921: Calling all_inventory to load vars for managed_node3 30575 1726867664.83923: Calling groups_inventory to load vars for managed_node3 30575 1726867664.83925: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867664.83930: Calling all_plugins_play to load vars for managed_node3 30575 1726867664.83932: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867664.83935: Calling groups_plugins_play to load vars for managed_node3 30575 1726867664.85076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867664.86607: done with get_vars() 30575 1726867664.86629: done getting variables 30575 1726867664.86753: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:27:44 -0400 (0:00:00.073) 0:01:40.245 ****** 30575 1726867664.86788: entering _queue_task() for managed_node3/stat 30575 1726867664.87144: worker is 1 (out of 1 available) 30575 1726867664.87156: exiting _queue_task() for managed_node3/stat 30575 1726867664.87168: done queuing things up, now waiting for results queue to drain 30575 1726867664.87169: waiting for pending results... 
30575 1726867664.87598: running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr 30575 1726867664.87624: in run() - task 0affcac9-a3a5-e081-a588-000000001fe8 30575 1726867664.87643: variable 'ansible_search_path' from source: unknown 30575 1726867664.87650: variable 'ansible_search_path' from source: unknown 30575 1726867664.87695: calling self._execute() 30575 1726867664.87789: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867664.87803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867664.87817: variable 'omit' from source: magic vars 30575 1726867664.88218: variable 'ansible_distribution_major_version' from source: facts 30575 1726867664.88344: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867664.88348: variable 'omit' from source: magic vars 30575 1726867664.88350: variable 'omit' from source: magic vars 30575 1726867664.88405: variable 'interface' from source: play vars 30575 1726867664.88427: variable 'omit' from source: magic vars 30575 1726867664.88479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867664.88519: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867664.88546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867664.88567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867664.88592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867664.88627: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867664.88636: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867664.88643: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867664.88747: Set connection var ansible_pipelining to False 30575 1726867664.88792: Set connection var ansible_shell_type to sh 30575 1726867664.88795: Set connection var ansible_shell_executable to /bin/sh 30575 1726867664.88798: Set connection var ansible_timeout to 10 30575 1726867664.88800: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867664.88801: Set connection var ansible_connection to ssh 30575 1726867664.88820: variable 'ansible_shell_executable' from source: unknown 30575 1726867664.88828: variable 'ansible_connection' from source: unknown 30575 1726867664.88835: variable 'ansible_module_compression' from source: unknown 30575 1726867664.88841: variable 'ansible_shell_type' from source: unknown 30575 1726867664.88900: variable 'ansible_shell_executable' from source: unknown 30575 1726867664.88903: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867664.88906: variable 'ansible_pipelining' from source: unknown 30575 1726867664.88908: variable 'ansible_timeout' from source: unknown 30575 1726867664.88911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867664.89079: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867664.89095: variable 'omit' from source: magic vars 30575 1726867664.89104: starting attempt loop 30575 1726867664.89114: running the handler 30575 1726867664.89137: _low_level_execute_command(): starting 30575 1726867664.89150: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867664.89973: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867664.89980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867664.89993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867664.90081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867664.91842: stdout chunk (state=3): >>>/root <<< 30575 1726867664.91965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867664.91987: stdout chunk (state=3): >>><<< 30575 1726867664.92001: stderr chunk (state=3): >>><<< 30575 1726867664.92026: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867664.92046: _low_level_execute_command(): starting 30575 1726867664.92129: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867664.9203272-35299-108693287519909 `" && echo ansible-tmp-1726867664.9203272-35299-108693287519909="` echo /root/.ansible/tmp/ansible-tmp-1726867664.9203272-35299-108693287519909 `" ) && sleep 0' 30575 1726867664.92895: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867664.92914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867664.92935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867664.92955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867664.92972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867664.92986: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867664.92999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867664.93087: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 30575 1726867664.93101: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867664.93138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867664.93155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867664.93173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867664.93256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867664.95140: stdout chunk (state=3): >>>ansible-tmp-1726867664.9203272-35299-108693287519909=/root/.ansible/tmp/ansible-tmp-1726867664.9203272-35299-108693287519909 <<< 30575 1726867664.95287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867664.95299: stdout chunk (state=3): >>><<< 30575 1726867664.95311: stderr chunk (state=3): >>><<< 30575 1726867664.95337: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867664.9203272-35299-108693287519909=/root/.ansible/tmp/ansible-tmp-1726867664.9203272-35299-108693287519909 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867664.95392: variable 'ansible_module_compression' from source: unknown 30575 1726867664.95461: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30575 1726867664.95510: variable 'ansible_facts' from source: unknown 30575 1726867664.95592: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867664.9203272-35299-108693287519909/AnsiballZ_stat.py 30575 1726867664.95835: Sending initial data 30575 1726867664.95838: Sent initial data (153 bytes) 30575 1726867664.96394: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867664.96496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867664.96511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867664.96527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867664.96596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867664.98180: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867664.98237: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867664.98293: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpvvw5vhwk /root/.ansible/tmp/ansible-tmp-1726867664.9203272-35299-108693287519909/AnsiballZ_stat.py <<< 30575 1726867664.98304: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867664.9203272-35299-108693287519909/AnsiballZ_stat.py" <<< 30575 1726867664.98336: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpvvw5vhwk" to remote "/root/.ansible/tmp/ansible-tmp-1726867664.9203272-35299-108693287519909/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867664.9203272-35299-108693287519909/AnsiballZ_stat.py" <<< 30575 1726867664.99146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867664.99205: stderr chunk (state=3): >>><<< 30575 1726867664.99220: stdout chunk (state=3): >>><<< 30575 1726867664.99253: done transferring module to remote 30575 1726867664.99346: _low_level_execute_command(): starting 30575 1726867664.99352: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867664.9203272-35299-108693287519909/ /root/.ansible/tmp/ansible-tmp-1726867664.9203272-35299-108693287519909/AnsiballZ_stat.py && sleep 0' 30575 1726867664.99993: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867665.00047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867665.00073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867665.00096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867665.00170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867665.02058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867665.02087: stdout chunk (state=3): >>><<< 30575 1726867665.02091: stderr chunk (state=3): >>><<< 30575 1726867665.02105: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867665.02187: _low_level_execute_command(): starting 30575 1726867665.02192: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867664.9203272-35299-108693287519909/AnsiballZ_stat.py && sleep 0' 30575 1726867665.02765: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867665.02873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867665.02898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867665.02978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867665.17973: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": 
false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30575 1726867665.19391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867665.19395: stdout chunk (state=3): >>><<< 30575 1726867665.19398: stderr chunk (state=3): >>><<< 30575 1726867665.19401: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
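For reference, the stat invocation recorded above can be reconstructed from the module_args visible in the log (path, get_attributes, get_checksum, get_mime) together with the task name and register variable that appear later in the run. This is a sketch of what the test task likely looks like, not the literal test source:

```yaml
# Sketch reconstructed from the logged module_args; not the literal test file.
- name: Get stat for interface statebr
  stat:
    path: /sys/class/net/statebr
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```

The logged result ({"changed": false, "stat": {"exists": false}}) is what this task stores in interface_stat for the assert that follows.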
30575 1726867665.19404: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867664.9203272-35299-108693287519909/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867665.19406: _low_level_execute_command(): starting 30575 1726867665.19408: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867664.9203272-35299-108693287519909/ > /dev/null 2>&1 && sleep 0' 30575 1726867665.20062: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867665.20175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867665.20234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867665.20275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867665.22099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867665.22125: stderr chunk (state=3): >>><<< 30575 1726867665.22131: stdout chunk (state=3): >>><<< 30575 1726867665.22146: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867665.22158: handler run complete 30575 1726867665.22174: attempt loop complete, returning result 30575 
1726867665.22184: _execute() done 30575 1726867665.22188: dumping result to json 30575 1726867665.22191: done dumping result, returning 30575 1726867665.22202: done running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr [0affcac9-a3a5-e081-a588-000000001fe8] 30575 1726867665.22205: sending task result for task 0affcac9-a3a5-e081-a588-000000001fe8 30575 1726867665.22301: done sending task result for task 0affcac9-a3a5-e081-a588-000000001fe8 30575 1726867665.22304: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30575 1726867665.22358: no more pending results, returning what we have 30575 1726867665.22362: results queue empty 30575 1726867665.22363: checking for any_errors_fatal 30575 1726867665.22364: done checking for any_errors_fatal 30575 1726867665.22365: checking for max_fail_percentage 30575 1726867665.22367: done checking for max_fail_percentage 30575 1726867665.22367: checking to see if all hosts have failed and the running result is not ok 30575 1726867665.22368: done checking to see if all hosts have failed 30575 1726867665.22369: getting the remaining hosts for this loop 30575 1726867665.22371: done getting the remaining hosts for this loop 30575 1726867665.22375: getting the next task for host managed_node3 30575 1726867665.22389: done getting next task for host managed_node3 30575 1726867665.22391: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30575 1726867665.22396: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867665.22401: getting variables 30575 1726867665.22402: in VariableManager get_vars() 30575 1726867665.22448: Calling all_inventory to load vars for managed_node3 30575 1726867665.22451: Calling groups_inventory to load vars for managed_node3 30575 1726867665.22454: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867665.22465: Calling all_plugins_play to load vars for managed_node3 30575 1726867665.22468: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867665.22470: Calling groups_plugins_play to load vars for managed_node3 30575 1726867665.23339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867665.24707: done with get_vars() 30575 1726867665.24740: done getting variables 30575 1726867665.24807: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867665.24934: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 17:27:45 -0400 (0:00:00.381) 0:01:40.627 ****** 30575 
1726867665.24968: entering _queue_task() for managed_node3/assert 30575 1726867665.25356: worker is 1 (out of 1 available) 30575 1726867665.25371: exiting _queue_task() for managed_node3/assert 30575 1726867665.25388: done queuing things up, now waiting for results queue to drain 30575 1726867665.25390: waiting for pending results... 30575 1726867665.25628: running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' 30575 1726867665.25722: in run() - task 0affcac9-a3a5-e081-a588-000000001f5a 30575 1726867665.25736: variable 'ansible_search_path' from source: unknown 30575 1726867665.25749: variable 'ansible_search_path' from source: unknown 30575 1726867665.25779: calling self._execute() 30575 1726867665.25859: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.25863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.25873: variable 'omit' from source: magic vars 30575 1726867665.26152: variable 'ansible_distribution_major_version' from source: facts 30575 1726867665.26161: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867665.26167: variable 'omit' from source: magic vars 30575 1726867665.26207: variable 'omit' from source: magic vars 30575 1726867665.26279: variable 'interface' from source: play vars 30575 1726867665.26299: variable 'omit' from source: magic vars 30575 1726867665.26331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867665.26357: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867665.26373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867665.26388: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.26398: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.26426: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867665.26429: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.26431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.26501: Set connection var ansible_pipelining to False 30575 1726867665.26504: Set connection var ansible_shell_type to sh 30575 1726867665.26511: Set connection var ansible_shell_executable to /bin/sh 30575 1726867665.26525: Set connection var ansible_timeout to 10 30575 1726867665.26528: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867665.26530: Set connection var ansible_connection to ssh 30575 1726867665.26547: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.26550: variable 'ansible_connection' from source: unknown 30575 1726867665.26552: variable 'ansible_module_compression' from source: unknown 30575 1726867665.26555: variable 'ansible_shell_type' from source: unknown 30575 1726867665.26557: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.26559: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.26563: variable 'ansible_pipelining' from source: unknown 30575 1726867665.26565: variable 'ansible_timeout' from source: unknown 30575 1726867665.26569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.26672: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867665.26682: variable 'omit' from source: magic vars 30575 1726867665.26687: 
starting attempt loop 30575 1726867665.26690: running the handler 30575 1726867665.26798: variable 'interface_stat' from source: set_fact 30575 1726867665.26805: Evaluated conditional (not interface_stat.stat.exists): True 30575 1726867665.26810: handler run complete 30575 1726867665.26823: attempt loop complete, returning result 30575 1726867665.26825: _execute() done 30575 1726867665.26828: dumping result to json 30575 1726867665.26830: done dumping result, returning 30575 1726867665.26838: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' [0affcac9-a3a5-e081-a588-000000001f5a] 30575 1726867665.26841: sending task result for task 0affcac9-a3a5-e081-a588-000000001f5a 30575 1726867665.26927: done sending task result for task 0affcac9-a3a5-e081-a588-000000001f5a 30575 1726867665.26929: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867665.26998: no more pending results, returning what we have 30575 1726867665.27002: results queue empty 30575 1726867665.27003: checking for any_errors_fatal 30575 1726867665.27012: done checking for any_errors_fatal 30575 1726867665.27013: checking for max_fail_percentage 30575 1726867665.27015: done checking for max_fail_percentage 30575 1726867665.27018: checking to see if all hosts have failed and the running result is not ok 30575 1726867665.27019: done checking to see if all hosts have failed 30575 1726867665.27020: getting the remaining hosts for this loop 30575 1726867665.27021: done getting the remaining hosts for this loop 30575 1726867665.27025: getting the next task for host managed_node3 30575 1726867665.27034: done getting next task for host managed_node3 30575 1726867665.27037: ^ task is: TASK: Success in test '{{ lsr_description }}' 30575 1726867665.27039: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867665.27043: getting variables 30575 1726867665.27044: in VariableManager get_vars() 30575 1726867665.27088: Calling all_inventory to load vars for managed_node3 30575 1726867665.27090: Calling groups_inventory to load vars for managed_node3 30575 1726867665.27094: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867665.27103: Calling all_plugins_play to load vars for managed_node3 30575 1726867665.27106: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867665.27108: Calling groups_plugins_play to load vars for managed_node3 30575 1726867665.28542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867665.29394: done with get_vars() 30575 1726867665.29410: done getting variables 30575 1726867665.29454: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867665.29540: variable 'lsr_description' from source: include params TASK [Success in test 'I can take a profile down that is absent'] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 17:27:45 -0400 (0:00:00.045) 0:01:40.673 ****** 30575 1726867665.29563: entering _queue_task() for managed_node3/debug 
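The assertion evaluated above ("Evaluated conditional (not interface_stat.stat.exists): True", result "All assertions passed") corresponds to a task along these lines. This is a sketch inferred from the logged conditional and task name, not the literal contents of assert_device_absent.yml:

```yaml
# Sketch inferred from the logged conditional; not the literal task file.
- name: "Assert that the interface is absent - '{{ interface }}'"
  assert:
    that:
      - not interface_stat.stat.exists
```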
30575 1726867665.29829: worker is 1 (out of 1 available) 30575 1726867665.29842: exiting _queue_task() for managed_node3/debug 30575 1726867665.29854: done queuing things up, now waiting for results queue to drain 30575 1726867665.29857: waiting for pending results... 30575 1726867665.30044: running TaskExecutor() for managed_node3/TASK: Success in test 'I can take a profile down that is absent' 30575 1726867665.30113: in run() - task 0affcac9-a3a5-e081-a588-00000000174b 30575 1726867665.30125: variable 'ansible_search_path' from source: unknown 30575 1726867665.30128: variable 'ansible_search_path' from source: unknown 30575 1726867665.30157: calling self._execute() 30575 1726867665.30235: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.30239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.30249: variable 'omit' from source: magic vars 30575 1726867665.30535: variable 'ansible_distribution_major_version' from source: facts 30575 1726867665.30545: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867665.30550: variable 'omit' from source: magic vars 30575 1726867665.30578: variable 'omit' from source: magic vars 30575 1726867665.30654: variable 'lsr_description' from source: include params 30575 1726867665.30668: variable 'omit' from source: magic vars 30575 1726867665.30702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867665.30730: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867665.30748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867665.30761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.30771: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.30798: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867665.30801: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.30803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.30873: Set connection var ansible_pipelining to False 30575 1726867665.30876: Set connection var ansible_shell_type to sh 30575 1726867665.30880: Set connection var ansible_shell_executable to /bin/sh 30575 1726867665.30887: Set connection var ansible_timeout to 10 30575 1726867665.30891: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867665.30897: Set connection var ansible_connection to ssh 30575 1726867665.30918: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.30921: variable 'ansible_connection' from source: unknown 30575 1726867665.30924: variable 'ansible_module_compression' from source: unknown 30575 1726867665.30926: variable 'ansible_shell_type' from source: unknown 30575 1726867665.30928: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.30930: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.30932: variable 'ansible_pipelining' from source: unknown 30575 1726867665.30934: variable 'ansible_timeout' from source: unknown 30575 1726867665.30938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.31040: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867665.31050: variable 'omit' from source: magic vars 30575 1726867665.31055: starting attempt 
loop 30575 1726867665.31057: running the handler 30575 1726867665.31099: handler run complete 30575 1726867665.31111: attempt loop complete, returning result 30575 1726867665.31114: _execute() done 30575 1726867665.31120: dumping result to json 30575 1726867665.31122: done dumping result, returning 30575 1726867665.31127: done running TaskExecutor() for managed_node3/TASK: Success in test 'I can take a profile down that is absent' [0affcac9-a3a5-e081-a588-00000000174b] 30575 1726867665.31132: sending task result for task 0affcac9-a3a5-e081-a588-00000000174b 30575 1726867665.31214: done sending task result for task 0affcac9-a3a5-e081-a588-00000000174b 30575 1726867665.31219: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: +++++ Success in test 'I can take a profile down that is absent' +++++ 30575 1726867665.31270: no more pending results, returning what we have 30575 1726867665.31274: results queue empty 30575 1726867665.31275: checking for any_errors_fatal 30575 1726867665.31284: done checking for any_errors_fatal 30575 1726867665.31285: checking for max_fail_percentage 30575 1726867665.31286: done checking for max_fail_percentage 30575 1726867665.31287: checking to see if all hosts have failed and the running result is not ok 30575 1726867665.31288: done checking to see if all hosts have failed 30575 1726867665.31289: getting the remaining hosts for this loop 30575 1726867665.31290: done getting the remaining hosts for this loop 30575 1726867665.31294: getting the next task for host managed_node3 30575 1726867665.31302: done getting next task for host managed_node3 30575 1726867665.31305: ^ task is: TASK: Cleanup 30575 1726867665.31308: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867665.31313: getting variables 30575 1726867665.31314: in VariableManager get_vars() 30575 1726867665.31358: Calling all_inventory to load vars for managed_node3 30575 1726867665.31361: Calling groups_inventory to load vars for managed_node3 30575 1726867665.31364: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867665.31376: Calling all_plugins_play to load vars for managed_node3 30575 1726867665.31385: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867665.31388: Calling groups_plugins_play to load vars for managed_node3 30575 1726867665.32191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867665.33082: done with get_vars() 30575 1726867665.33097: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 17:27:45 -0400 (0:00:00.035) 0:01:40.709 ****** 30575 1726867665.33163: entering _queue_task() for managed_node3/include_tasks 30575 1726867665.33390: worker is 1 (out of 1 available) 30575 1726867665.33403: exiting _queue_task() for managed_node3/include_tasks 30575 1726867665.33418: done queuing things up, now waiting for results queue to drain 30575 1726867665.33419: waiting for pending results... 
30575 1726867665.33595: running TaskExecutor() for managed_node3/TASK: Cleanup 30575 1726867665.33659: in run() - task 0affcac9-a3a5-e081-a588-00000000174f 30575 1726867665.33671: variable 'ansible_search_path' from source: unknown 30575 1726867665.33675: variable 'ansible_search_path' from source: unknown 30575 1726867665.33710: variable 'lsr_cleanup' from source: include params 30575 1726867665.33861: variable 'lsr_cleanup' from source: include params 30575 1726867665.33921: variable 'omit' from source: magic vars 30575 1726867665.34085: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.34089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.34091: variable 'omit' from source: magic vars 30575 1726867665.34203: variable 'ansible_distribution_major_version' from source: facts 30575 1726867665.34211: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867665.34224: variable 'item' from source: unknown 30575 1726867665.34268: variable 'item' from source: unknown 30575 1726867665.34293: variable 'item' from source: unknown 30575 1726867665.34343: variable 'item' from source: unknown 30575 1726867665.34457: dumping result to json 30575 1726867665.34461: done dumping result, returning 30575 1726867665.34464: done running TaskExecutor() for managed_node3/TASK: Cleanup [0affcac9-a3a5-e081-a588-00000000174f] 30575 1726867665.34466: sending task result for task 0affcac9-a3a5-e081-a588-00000000174f 30575 1726867665.34505: done sending task result for task 0affcac9-a3a5-e081-a588-00000000174f 30575 1726867665.34508: WORKER PROCESS EXITING 30575 1726867665.34528: no more pending results, returning what we have 30575 1726867665.34532: in VariableManager get_vars() 30575 1726867665.34574: Calling all_inventory to load vars for managed_node3 30575 1726867665.34576: Calling groups_inventory to load vars for managed_node3 30575 1726867665.34581: Calling 
all_plugins_inventory to load vars for managed_node3 30575 1726867665.34592: Calling all_plugins_play to load vars for managed_node3 30575 1726867665.34594: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867665.34597: Calling groups_plugins_play to load vars for managed_node3 30575 1726867665.35483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867665.36552: done with get_vars() 30575 1726867665.36571: variable 'ansible_search_path' from source: unknown 30575 1726867665.36573: variable 'ansible_search_path' from source: unknown 30575 1726867665.36612: we have included files to process 30575 1726867665.36613: generating all_blocks data 30575 1726867665.36615: done generating all_blocks data 30575 1726867665.36618: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867665.36620: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867665.36622: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867665.36809: done processing included file 30575 1726867665.36811: iterating over new_blocks loaded from include file 30575 1726867665.36813: in VariableManager get_vars() 30575 1726867665.36829: done with get_vars() 30575 1726867665.36831: filtering new block on tags 30575 1726867665.36856: done filtering new block on tags 30575 1726867665.36859: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node3 => (item=tasks/cleanup_profile+device.yml) 30575 1726867665.36864: extending task lists for all hosts with included blocks 
30575 1726867665.37984: done extending task lists 30575 1726867665.37986: done processing included files 30575 1726867665.37987: results queue empty 30575 1726867665.37987: checking for any_errors_fatal 30575 1726867665.37990: done checking for any_errors_fatal 30575 1726867665.37991: checking for max_fail_percentage 30575 1726867665.37991: done checking for max_fail_percentage 30575 1726867665.37992: checking to see if all hosts have failed and the running result is not ok 30575 1726867665.37993: done checking to see if all hosts have failed 30575 1726867665.37993: getting the remaining hosts for this loop 30575 1726867665.37994: done getting the remaining hosts for this loop 30575 1726867665.37996: getting the next task for host managed_node3 30575 1726867665.37999: done getting next task for host managed_node3 30575 1726867665.38000: ^ task is: TASK: Cleanup profile and device 30575 1726867665.38002: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867665.38004: getting variables 30575 1726867665.38004: in VariableManager get_vars() 30575 1726867665.38012: Calling all_inventory to load vars for managed_node3 30575 1726867665.38014: Calling groups_inventory to load vars for managed_node3 30575 1726867665.38015: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867665.38020: Calling all_plugins_play to load vars for managed_node3 30575 1726867665.38021: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867665.38023: Calling groups_plugins_play to load vars for managed_node3 30575 1726867665.38671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867665.39562: done with get_vars() 30575 1726867665.39579: done getting variables 30575 1726867665.39607: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 17:27:45 -0400 (0:00:00.064) 0:01:40.773 ****** 30575 1726867665.39630: entering _queue_task() for managed_node3/shell 30575 1726867665.39871: worker is 1 (out of 1 available) 30575 1726867665.39885: exiting _queue_task() for managed_node3/shell 30575 1726867665.39895: done queuing things up, now waiting for results queue to drain 30575 1726867665.39896: waiting for pending results... 
30575 1726867665.40087: running TaskExecutor() for managed_node3/TASK: Cleanup profile and device 30575 1726867665.40157: in run() - task 0affcac9-a3a5-e081-a588-00000000200b 30575 1726867665.40169: variable 'ansible_search_path' from source: unknown 30575 1726867665.40173: variable 'ansible_search_path' from source: unknown 30575 1726867665.40204: calling self._execute() 30575 1726867665.40279: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.40283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.40292: variable 'omit' from source: magic vars 30575 1726867665.40579: variable 'ansible_distribution_major_version' from source: facts 30575 1726867665.40589: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867665.40594: variable 'omit' from source: magic vars 30575 1726867665.40627: variable 'omit' from source: magic vars 30575 1726867665.40729: variable 'interface' from source: play vars 30575 1726867665.40745: variable 'omit' from source: magic vars 30575 1726867665.40783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867665.40807: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867665.40823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867665.40836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.40848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.40873: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867665.40876: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.40881: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.40951: Set connection var ansible_pipelining to False 30575 1726867665.40954: Set connection var ansible_shell_type to sh 30575 1726867665.40957: Set connection var ansible_shell_executable to /bin/sh 30575 1726867665.40963: Set connection var ansible_timeout to 10 30575 1726867665.40968: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867665.40974: Set connection var ansible_connection to ssh 30575 1726867665.40997: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.41002: variable 'ansible_connection' from source: unknown 30575 1726867665.41005: variable 'ansible_module_compression' from source: unknown 30575 1726867665.41007: variable 'ansible_shell_type' from source: unknown 30575 1726867665.41010: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.41013: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.41015: variable 'ansible_pipelining' from source: unknown 30575 1726867665.41019: variable 'ansible_timeout' from source: unknown 30575 1726867665.41022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.41120: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867665.41128: variable 'omit' from source: magic vars 30575 1726867665.41134: starting attempt loop 30575 1726867665.41141: running the handler 30575 1726867665.41150: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867665.41167: _low_level_execute_command(): starting 30575 1726867665.41174: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867665.41670: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867665.41708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867665.41711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867665.41714: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867665.41720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867665.41763: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867665.41766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867665.41768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867665.41829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867665.43529: stdout chunk 
(state=3): >>>/root <<< 30575 1726867665.43629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867665.43656: stderr chunk (state=3): >>><<< 30575 1726867665.43660: stdout chunk (state=3): >>><<< 30575 1726867665.43681: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867665.43692: _low_level_execute_command(): starting 30575 1726867665.43698: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867665.4368024-35323-265553170497925 `" && echo ansible-tmp-1726867665.4368024-35323-265553170497925="` echo /root/.ansible/tmp/ansible-tmp-1726867665.4368024-35323-265553170497925 `" ) && sleep 0' 30575 1726867665.44125: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867665.44135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867665.44138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867665.44140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867665.44143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867665.44187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867665.44190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867665.44195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867665.44250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867665.46129: stdout chunk (state=3): >>>ansible-tmp-1726867665.4368024-35323-265553170497925=/root/.ansible/tmp/ansible-tmp-1726867665.4368024-35323-265553170497925 <<< 30575 1726867665.46244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867665.46270: stderr chunk (state=3): >>><<< 30575 1726867665.46273: stdout chunk (state=3): >>><<< 30575 1726867665.46289: _low_level_execute_command() done: 
rc=0, stdout=ansible-tmp-1726867665.4368024-35323-265553170497925=/root/.ansible/tmp/ansible-tmp-1726867665.4368024-35323-265553170497925 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867665.46313: variable 'ansible_module_compression' from source: unknown 30575 1726867665.46354: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867665.46392: variable 'ansible_facts' from source: unknown 30575 1726867665.46440: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867665.4368024-35323-265553170497925/AnsiballZ_command.py 30575 1726867665.46537: Sending initial data 30575 1726867665.46540: Sent initial data (156 bytes) 30575 1726867665.47468: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867665.47493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867665.47566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867665.49096: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 
1726867665.49157: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867665.49231: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpfobu3a36 /root/.ansible/tmp/ansible-tmp-1726867665.4368024-35323-265553170497925/AnsiballZ_command.py <<< 30575 1726867665.49234: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867665.4368024-35323-265553170497925/AnsiballZ_command.py" <<< 30575 1726867665.49270: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpfobu3a36" to remote "/root/.ansible/tmp/ansible-tmp-1726867665.4368024-35323-265553170497925/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867665.4368024-35323-265553170497925/AnsiballZ_command.py" <<< 30575 1726867665.49990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867665.50020: stderr chunk (state=3): >>><<< 30575 1726867665.50161: stdout chunk (state=3): >>><<< 30575 1726867665.50164: done transferring module to remote 30575 1726867665.50167: _low_level_execute_command(): starting 30575 1726867665.50169: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867665.4368024-35323-265553170497925/ /root/.ansible/tmp/ansible-tmp-1726867665.4368024-35323-265553170497925/AnsiballZ_command.py && sleep 0' 30575 1726867665.50784: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867665.50791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867665.50794: stderr chunk (state=3): >>>debug2: match found <<< 30575 1726867665.50806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867665.50849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867665.50863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867665.50886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867665.50956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867665.52858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867665.52863: stdout chunk (state=3): >>><<< 30575 1726867665.52865: stderr chunk (state=3): >>><<< 30575 1726867665.52868: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867665.52870: _low_level_execute_command(): starting 30575 1726867665.52872: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867665.4368024-35323-265553170497925/AnsiballZ_command.py && sleep 0' 30575 1726867665.53492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867665.53495: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867665.53557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867665.53575: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867665.53596: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867665.53672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867665.71935: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 17:27:45.686085", "end": "2024-09-20 17:27:45.717236", "delta": "0:00:00.031151", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867665.73401: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867665.73432: stderr chunk (state=3): >>><<< 30575 1726867665.73435: stdout chunk (state=3): >>><<< 30575 1726867665.73451: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 17:27:45.686085", "end": "2024-09-20 17:27:45.717236", "delta": "0:00:00.031151", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. 30575 1726867665.73487: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867665.4368024-35323-265553170497925/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867665.73495: _low_level_execute_command(): starting 30575 1726867665.73498: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867665.4368024-35323-265553170497925/ > /dev/null 2>&1 && sleep 0' 30575 1726867665.73956: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867665.73959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match not found <<< 30575 1726867665.73962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867665.73964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867665.74019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867665.74023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867665.74029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867665.74073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867665.75878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867665.75907: stderr chunk (state=3): >>><<< 30575 1726867665.75910: stdout chunk (state=3): >>><<< 30575 1726867665.75925: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867665.75931: handler run complete 30575 1726867665.75948: Evaluated conditional (False): False 30575 1726867665.75956: attempt loop complete, returning result 30575 1726867665.75959: _execute() done 30575 1726867665.75961: dumping result to json 30575 1726867665.75966: done dumping result, returning 30575 1726867665.75974: done running TaskExecutor() for managed_node3/TASK: Cleanup profile and device [0affcac9-a3a5-e081-a588-00000000200b] 30575 1726867665.75980: sending task result for task 0affcac9-a3a5-e081-a588-00000000200b 30575 1726867665.76078: done sending task result for task 0affcac9-a3a5-e081-a588-00000000200b 30575 1726867665.76082: WORKER PROCESS EXITING fatal: [managed_node3]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.031151", "end": "2024-09-20 17:27:45.717236", "rc": 1, "start": "2024-09-20 17:27:45.686085" } STDERR: Error: unknown connection 'statebr'. Error: cannot delete unknown connection(s): 'statebr'. 
Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 30575 1726867665.76145: no more pending results, returning what we have 30575 1726867665.76149: results queue empty 30575 1726867665.76150: checking for any_errors_fatal 30575 1726867665.76151: done checking for any_errors_fatal 30575 1726867665.76151: checking for max_fail_percentage 30575 1726867665.76153: done checking for max_fail_percentage 30575 1726867665.76154: checking to see if all hosts have failed and the running result is not ok 30575 1726867665.76155: done checking to see if all hosts have failed 30575 1726867665.76156: getting the remaining hosts for this loop 30575 1726867665.76157: done getting the remaining hosts for this loop 30575 1726867665.76161: getting the next task for host managed_node3 30575 1726867665.76172: done getting next task for host managed_node3 30575 1726867665.76175: ^ task is: TASK: Include the task 'run_test.yml' 30575 1726867665.76184: ^ state is: HOST STATE: block=8, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867665.76189: getting variables 30575 1726867665.76191: in VariableManager get_vars() 30575 1726867665.76236: Calling all_inventory to load vars for managed_node3 30575 1726867665.76238: Calling groups_inventory to load vars for managed_node3 30575 1726867665.76242: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867665.76253: Calling all_plugins_play to load vars for managed_node3 30575 1726867665.76255: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867665.76258: Calling groups_plugins_play to load vars for managed_node3 30575 1726867665.77102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867665.77982: done with get_vars() 30575 1726867665.77998: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:124 Friday 20 September 2024 17:27:45 -0400 (0:00:00.384) 0:01:41.158 ****** 30575 1726867665.78068: entering _queue_task() for managed_node3/include_tasks 30575 1726867665.78302: worker is 1 (out of 1 available) 30575 1726867665.78314: exiting _queue_task() for managed_node3/include_tasks 30575 1726867665.78329: done queuing things up, now waiting for results queue to drain 30575 1726867665.78331: waiting for pending results... 
30575 1726867665.78523: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' 30575 1726867665.78587: in run() - task 0affcac9-a3a5-e081-a588-000000000017 30575 1726867665.78599: variable 'ansible_search_path' from source: unknown 30575 1726867665.78629: calling self._execute() 30575 1726867665.78698: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.78703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.78714: variable 'omit' from source: magic vars 30575 1726867665.79007: variable 'ansible_distribution_major_version' from source: facts 30575 1726867665.79019: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867665.79023: _execute() done 30575 1726867665.79027: dumping result to json 30575 1726867665.79030: done dumping result, returning 30575 1726867665.79037: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [0affcac9-a3a5-e081-a588-000000000017] 30575 1726867665.79043: sending task result for task 0affcac9-a3a5-e081-a588-000000000017 30575 1726867665.79146: done sending task result for task 0affcac9-a3a5-e081-a588-000000000017 30575 1726867665.79149: WORKER PROCESS EXITING 30575 1726867665.79174: no more pending results, returning what we have 30575 1726867665.79181: in VariableManager get_vars() 30575 1726867665.79231: Calling all_inventory to load vars for managed_node3 30575 1726867665.79233: Calling groups_inventory to load vars for managed_node3 30575 1726867665.79237: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867665.79247: Calling all_plugins_play to load vars for managed_node3 30575 1726867665.79250: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867665.79252: Calling groups_plugins_play to load vars for managed_node3 30575 1726867665.80176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30575 1726867665.81027: done with get_vars() 30575 1726867665.81040: variable 'ansible_search_path' from source: unknown 30575 1726867665.81050: we have included files to process 30575 1726867665.81051: generating all_blocks data 30575 1726867665.81052: done generating all_blocks data 30575 1726867665.81057: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867665.81058: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867665.81059: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30575 1726867665.81312: in VariableManager get_vars() 30575 1726867665.81326: done with get_vars() 30575 1726867665.81351: in VariableManager get_vars() 30575 1726867665.81361: done with get_vars() 30575 1726867665.81389: in VariableManager get_vars() 30575 1726867665.81400: done with get_vars() 30575 1726867665.81427: in VariableManager get_vars() 30575 1726867665.81438: done with get_vars() 30575 1726867665.81462: in VariableManager get_vars() 30575 1726867665.81472: done with get_vars() 30575 1726867665.81731: in VariableManager get_vars() 30575 1726867665.81742: done with get_vars() 30575 1726867665.81749: done processing included file 30575 1726867665.81750: iterating over new_blocks loaded from include file 30575 1726867665.81751: in VariableManager get_vars() 30575 1726867665.81758: done with get_vars() 30575 1726867665.81758: filtering new block on tags 30575 1726867665.81820: done filtering new block on tags 30575 1726867665.81822: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3 30575 1726867665.81826: extending task lists for all hosts with included 
blocks 30575 1726867665.81847: done extending task lists 30575 1726867665.81848: done processing included files 30575 1726867665.81848: results queue empty 30575 1726867665.81848: checking for any_errors_fatal 30575 1726867665.81851: done checking for any_errors_fatal 30575 1726867665.81852: checking for max_fail_percentage 30575 1726867665.81852: done checking for max_fail_percentage 30575 1726867665.81853: checking to see if all hosts have failed and the running result is not ok 30575 1726867665.81853: done checking to see if all hosts have failed 30575 1726867665.81854: getting the remaining hosts for this loop 30575 1726867665.81855: done getting the remaining hosts for this loop 30575 1726867665.81856: getting the next task for host managed_node3 30575 1726867665.81859: done getting next task for host managed_node3 30575 1726867665.81860: ^ task is: TASK: TEST: {{ lsr_description }} 30575 1726867665.81862: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867665.81863: getting variables 30575 1726867665.81864: in VariableManager get_vars() 30575 1726867665.81870: Calling all_inventory to load vars for managed_node3 30575 1726867665.81871: Calling groups_inventory to load vars for managed_node3 30575 1726867665.81872: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867665.81876: Calling all_plugins_play to load vars for managed_node3 30575 1726867665.81880: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867665.81882: Calling groups_plugins_play to load vars for managed_node3 30575 1726867665.82511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867665.83413: done with get_vars() 30575 1726867665.83429: done getting variables 30575 1726867665.83457: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867665.83537: variable 'lsr_description' from source: include params TASK [TEST: I will not get an error when I try to remove an absent profile] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 17:27:45 -0400 (0:00:00.054) 0:01:41.213 ****** 30575 1726867665.83559: entering _queue_task() for managed_node3/debug 30575 1726867665.83798: worker is 1 (out of 1 available) 30575 1726867665.83811: exiting _queue_task() for managed_node3/debug 30575 1726867665.83826: done queuing things up, now waiting for results queue to drain 30575 1726867665.83828: waiting for pending results... 
30575 1726867665.84006: running TaskExecutor() for managed_node3/TASK: TEST: I will not get an error when I try to remove an absent profile 30575 1726867665.84083: in run() - task 0affcac9-a3a5-e081-a588-0000000020ad 30575 1726867665.84095: variable 'ansible_search_path' from source: unknown 30575 1726867665.84099: variable 'ansible_search_path' from source: unknown 30575 1726867665.84126: calling self._execute() 30575 1726867665.84197: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.84201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.84211: variable 'omit' from source: magic vars 30575 1726867665.84480: variable 'ansible_distribution_major_version' from source: facts 30575 1726867665.84497: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867665.84500: variable 'omit' from source: magic vars 30575 1726867665.84521: variable 'omit' from source: magic vars 30575 1726867665.84592: variable 'lsr_description' from source: include params 30575 1726867665.84609: variable 'omit' from source: magic vars 30575 1726867665.84641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867665.84667: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867665.84684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867665.84697: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.84714: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.84734: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867665.84737: variable 'ansible_host' from source: host vars for 'managed_node3' 
30575 1726867665.84740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.84809: Set connection var ansible_pipelining to False 30575 1726867665.84814: Set connection var ansible_shell_type to sh 30575 1726867665.84819: Set connection var ansible_shell_executable to /bin/sh 30575 1726867665.84822: Set connection var ansible_timeout to 10 30575 1726867665.84833: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867665.84836: Set connection var ansible_connection to ssh 30575 1726867665.84852: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.84855: variable 'ansible_connection' from source: unknown 30575 1726867665.84858: variable 'ansible_module_compression' from source: unknown 30575 1726867665.84861: variable 'ansible_shell_type' from source: unknown 30575 1726867665.84864: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.84866: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.84868: variable 'ansible_pipelining' from source: unknown 30575 1726867665.84870: variable 'ansible_timeout' from source: unknown 30575 1726867665.84874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.84972: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867665.84982: variable 'omit' from source: magic vars 30575 1726867665.84987: starting attempt loop 30575 1726867665.84990: running the handler 30575 1726867665.85026: handler run complete 30575 1726867665.85036: attempt loop complete, returning result 30575 1726867665.85040: _execute() done 30575 1726867665.85043: dumping result to json 30575 1726867665.85046: done dumping result, 
returning 30575 1726867665.85056: done running TaskExecutor() for managed_node3/TASK: TEST: I will not get an error when I try to remove an absent profile [0affcac9-a3a5-e081-a588-0000000020ad] 30575 1726867665.85058: sending task result for task 0affcac9-a3a5-e081-a588-0000000020ad 30575 1726867665.85133: done sending task result for task 0affcac9-a3a5-e081-a588-0000000020ad 30575 1726867665.85135: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ########## I will not get an error when I try to remove an absent profile ########## 30575 1726867665.85200: no more pending results, returning what we have 30575 1726867665.85203: results queue empty 30575 1726867665.85204: checking for any_errors_fatal 30575 1726867665.85205: done checking for any_errors_fatal 30575 1726867665.85206: checking for max_fail_percentage 30575 1726867665.85207: done checking for max_fail_percentage 30575 1726867665.85209: checking to see if all hosts have failed and the running result is not ok 30575 1726867665.85210: done checking to see if all hosts have failed 30575 1726867665.85210: getting the remaining hosts for this loop 30575 1726867665.85212: done getting the remaining hosts for this loop 30575 1726867665.85215: getting the next task for host managed_node3 30575 1726867665.85224: done getting next task for host managed_node3 30575 1726867665.85226: ^ task is: TASK: Show item 30575 1726867665.85229: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867665.85232: getting variables 30575 1726867665.85233: in VariableManager get_vars() 30575 1726867665.85267: Calling all_inventory to load vars for managed_node3 30575 1726867665.85269: Calling groups_inventory to load vars for managed_node3 30575 1726867665.85273: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867665.85283: Calling all_plugins_play to load vars for managed_node3 30575 1726867665.85286: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867665.85288: Calling groups_plugins_play to load vars for managed_node3 30575 1726867665.86056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867665.86926: done with get_vars() 30575 1726867665.86942: done getting variables 30575 1726867665.86980: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 17:27:45 -0400 (0:00:00.034) 0:01:41.247 ****** 30575 1726867665.87000: entering _queue_task() for managed_node3/debug 30575 1726867665.87207: worker is 1 (out of 1 available) 30575 1726867665.87223: exiting _queue_task() for managed_node3/debug 30575 1726867665.87235: done queuing things up, now waiting for results queue to drain 30575 1726867665.87237: waiting for pending results... 
30575 1726867665.87402: running TaskExecutor() for managed_node3/TASK: Show item 30575 1726867665.87470: in run() - task 0affcac9-a3a5-e081-a588-0000000020ae 30575 1726867665.87485: variable 'ansible_search_path' from source: unknown 30575 1726867665.87488: variable 'ansible_search_path' from source: unknown 30575 1726867665.87527: variable 'omit' from source: magic vars 30575 1726867665.87630: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.87637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.87647: variable 'omit' from source: magic vars 30575 1726867665.87896: variable 'ansible_distribution_major_version' from source: facts 30575 1726867665.87909: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867665.87912: variable 'omit' from source: magic vars 30575 1726867665.87937: variable 'omit' from source: magic vars 30575 1726867665.87964: variable 'item' from source: unknown 30575 1726867665.88021: variable 'item' from source: unknown 30575 1726867665.88030: variable 'omit' from source: magic vars 30575 1726867665.88060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867665.88087: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867665.88102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867665.88119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.88130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.88150: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867665.88153: variable 'ansible_host' from source: host vars for 'managed_node3' 
30575 1726867665.88156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.88221: Set connection var ansible_pipelining to False 30575 1726867665.88225: Set connection var ansible_shell_type to sh 30575 1726867665.88229: Set connection var ansible_shell_executable to /bin/sh 30575 1726867665.88233: Set connection var ansible_timeout to 10 30575 1726867665.88243: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867665.88249: Set connection var ansible_connection to ssh 30575 1726867665.88265: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.88268: variable 'ansible_connection' from source: unknown 30575 1726867665.88270: variable 'ansible_module_compression' from source: unknown 30575 1726867665.88273: variable 'ansible_shell_type' from source: unknown 30575 1726867665.88275: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.88280: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.88284: variable 'ansible_pipelining' from source: unknown 30575 1726867665.88286: variable 'ansible_timeout' from source: unknown 30575 1726867665.88290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.88386: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867665.88394: variable 'omit' from source: magic vars 30575 1726867665.88399: starting attempt loop 30575 1726867665.88402: running the handler 30575 1726867665.88438: variable 'lsr_description' from source: include params 30575 1726867665.88487: variable 'lsr_description' from source: include params 30575 1726867665.88494: handler run complete 30575 1726867665.88508: attempt loop 
complete, returning result 30575 1726867665.88569: variable 'item' from source: unknown 30575 1726867665.88572: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I will not get an error when I try to remove an absent profile" } 30575 1726867665.88700: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.88704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.88707: variable 'omit' from source: magic vars 30575 1726867665.88775: variable 'ansible_distribution_major_version' from source: facts 30575 1726867665.88780: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867665.88785: variable 'omit' from source: magic vars 30575 1726867665.88796: variable 'omit' from source: magic vars 30575 1726867665.88829: variable 'item' from source: unknown 30575 1726867665.88867: variable 'item' from source: unknown 30575 1726867665.88880: variable 'omit' from source: magic vars 30575 1726867665.88894: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867665.88901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.88906: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.88918: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867665.88921: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.88923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.88968: Set connection var ansible_pipelining to False 30575 1726867665.88971: Set connection var 
ansible_shell_type to sh 30575 1726867665.88973: Set connection var ansible_shell_executable to /bin/sh 30575 1726867665.88980: Set connection var ansible_timeout to 10 30575 1726867665.88985: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867665.88991: Set connection var ansible_connection to ssh 30575 1726867665.89005: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.89008: variable 'ansible_connection' from source: unknown 30575 1726867665.89010: variable 'ansible_module_compression' from source: unknown 30575 1726867665.89013: variable 'ansible_shell_type' from source: unknown 30575 1726867665.89017: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.89020: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.89022: variable 'ansible_pipelining' from source: unknown 30575 1726867665.89024: variable 'ansible_timeout' from source: unknown 30575 1726867665.89026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.89083: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867665.89091: variable 'omit' from source: magic vars 30575 1726867665.89093: starting attempt loop 30575 1726867665.89096: running the handler 30575 1726867665.89113: variable 'lsr_setup' from source: include params 30575 1726867665.89163: variable 'lsr_setup' from source: include params 30575 1726867665.89200: handler run complete 30575 1726867665.89211: attempt loop complete, returning result 30575 1726867665.89222: variable 'item' from source: unknown 30575 1726867665.89268: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": 
"lsr_setup", "lsr_setup": [ "tasks/create_bridge_profile.yml", "tasks/activate_profile.yml", "tasks/remove+down_profile.yml" ] } 30575 1726867665.89345: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.89349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.89351: variable 'omit' from source: magic vars 30575 1726867665.89450: variable 'ansible_distribution_major_version' from source: facts 30575 1726867665.89453: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867665.89457: variable 'omit' from source: magic vars 30575 1726867665.89467: variable 'omit' from source: magic vars 30575 1726867665.89497: variable 'item' from source: unknown 30575 1726867665.89540: variable 'item' from source: unknown 30575 1726867665.89551: variable 'omit' from source: magic vars 30575 1726867665.89564: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867665.89569: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.89575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.89591: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867665.89594: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.89596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.89636: Set connection var ansible_pipelining to False 30575 1726867665.89639: Set connection var ansible_shell_type to sh 30575 1726867665.89642: Set connection var ansible_shell_executable to /bin/sh 30575 1726867665.89647: Set connection var ansible_timeout to 10 30575 1726867665.89652: Set connection var 
ansible_module_compression to ZIP_DEFLATED 30575 1726867665.89658: Set connection var ansible_connection to ssh 30575 1726867665.89673: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.89676: variable 'ansible_connection' from source: unknown 30575 1726867665.89680: variable 'ansible_module_compression' from source: unknown 30575 1726867665.89682: variable 'ansible_shell_type' from source: unknown 30575 1726867665.89685: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.89688: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.89690: variable 'ansible_pipelining' from source: unknown 30575 1726867665.89694: variable 'ansible_timeout' from source: unknown 30575 1726867665.89696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.89752: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867665.89757: variable 'omit' from source: magic vars 30575 1726867665.89760: starting attempt loop 30575 1726867665.89763: running the handler 30575 1726867665.89779: variable 'lsr_test' from source: include params 30575 1726867665.89826: variable 'lsr_test' from source: include params 30575 1726867665.89838: handler run complete 30575 1726867665.89848: attempt loop complete, returning result 30575 1726867665.89859: variable 'item' from source: unknown 30575 1726867665.89902: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove+down_profile.yml" ] } 30575 1726867665.89975: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.89981: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30575 1726867665.89983: variable 'omit' from source: magic vars 30575 1726867665.90079: variable 'ansible_distribution_major_version' from source: facts 30575 1726867665.90082: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867665.90085: variable 'omit' from source: magic vars 30575 1726867665.90102: variable 'omit' from source: magic vars 30575 1726867665.90126: variable 'item' from source: unknown 30575 1726867665.90168: variable 'item' from source: unknown 30575 1726867665.90180: variable 'omit' from source: magic vars 30575 1726867665.90193: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867665.90199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.90203: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.90215: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867665.90220: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.90222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.90263: Set connection var ansible_pipelining to False 30575 1726867665.90266: Set connection var ansible_shell_type to sh 30575 1726867665.90269: Set connection var ansible_shell_executable to /bin/sh 30575 1726867665.90274: Set connection var ansible_timeout to 10 30575 1726867665.90280: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867665.90286: Set connection var ansible_connection to ssh 30575 1726867665.90300: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.90303: variable 'ansible_connection' from source: unknown 30575 1726867665.90305: 
variable 'ansible_module_compression' from source: unknown 30575 1726867665.90307: variable 'ansible_shell_type' from source: unknown 30575 1726867665.90309: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.90313: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.90318: variable 'ansible_pipelining' from source: unknown 30575 1726867665.90320: variable 'ansible_timeout' from source: unknown 30575 1726867665.90322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.90376: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867665.90383: variable 'omit' from source: magic vars 30575 1726867665.90387: starting attempt loop 30575 1726867665.90389: running the handler 30575 1726867665.90403: variable 'lsr_assert' from source: include params 30575 1726867665.90447: variable 'lsr_assert' from source: include params 30575 1726867665.90461: handler run complete 30575 1726867665.90471: attempt loop complete, returning result 30575 1726867665.90483: variable 'item' from source: unknown 30575 1726867665.90525: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_absent.yml", "tasks/get_NetworkManager_NVR.yml" ] } 30575 1726867665.90595: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.90599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.90602: variable 'omit' from source: magic vars 30575 1726867665.90735: variable 'ansible_distribution_major_version' from source: facts 30575 1726867665.90743: Evaluated conditional 
(ansible_distribution_major_version != '6'): True 30575 1726867665.90745: variable 'omit' from source: magic vars 30575 1726867665.90754: variable 'omit' from source: magic vars 30575 1726867665.90781: variable 'item' from source: unknown 30575 1726867665.90824: variable 'item' from source: unknown 30575 1726867665.90836: variable 'omit' from source: magic vars 30575 1726867665.90850: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867665.90856: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.90862: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.90871: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867665.90874: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.90876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.90920: Set connection var ansible_pipelining to False 30575 1726867665.90923: Set connection var ansible_shell_type to sh 30575 1726867665.90926: Set connection var ansible_shell_executable to /bin/sh 30575 1726867665.90928: Set connection var ansible_timeout to 10 30575 1726867665.90935: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867665.90946: Set connection var ansible_connection to ssh 30575 1726867665.90957: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.90959: variable 'ansible_connection' from source: unknown 30575 1726867665.90962: variable 'ansible_module_compression' from source: unknown 30575 1726867665.90964: variable 'ansible_shell_type' from source: unknown 30575 1726867665.90966: variable 'ansible_shell_executable' from source: unknown 30575 
1726867665.90968: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.90972: variable 'ansible_pipelining' from source: unknown 30575 1726867665.90975: variable 'ansible_timeout' from source: unknown 30575 1726867665.90980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.91034: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867665.91039: variable 'omit' from source: magic vars 30575 1726867665.91046: starting attempt loop 30575 1726867665.91048: running the handler 30575 1726867665.91063: variable 'lsr_assert_when' from source: include params 30575 1726867665.91106: variable 'lsr_assert_when' from source: include params 30575 1726867665.91167: variable 'network_provider' from source: set_fact 30575 1726867665.91189: handler run complete 30575 1726867665.91199: attempt loop complete, returning result 30575 1726867665.91210: variable 'item' from source: unknown 30575 1726867665.91251: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_absent.yml" } ] } 30575 1726867665.91327: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.91330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.91333: variable 'omit' from source: magic vars 30575 1726867665.91425: variable 'ansible_distribution_major_version' from source: facts 30575 1726867665.91428: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867665.91431: variable 'omit' from source: magic vars 30575 1726867665.91444: 
variable 'omit' from source: magic vars 30575 1726867665.91470: variable 'item' from source: unknown 30575 1726867665.91513: variable 'item' from source: unknown 30575 1726867665.91524: variable 'omit' from source: magic vars 30575 1726867665.91537: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867665.91543: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.91549: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.91558: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867665.91567: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.91569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.91612: Set connection var ansible_pipelining to False 30575 1726867665.91615: Set connection var ansible_shell_type to sh 30575 1726867665.91621: Set connection var ansible_shell_executable to /bin/sh 30575 1726867665.91626: Set connection var ansible_timeout to 10 30575 1726867665.91631: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867665.91637: Set connection var ansible_connection to ssh 30575 1726867665.91652: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.91655: variable 'ansible_connection' from source: unknown 30575 1726867665.91657: variable 'ansible_module_compression' from source: unknown 30575 1726867665.91660: variable 'ansible_shell_type' from source: unknown 30575 1726867665.91662: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.91675: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.91679: variable 'ansible_pipelining' from source: 
unknown 30575 1726867665.91681: variable 'ansible_timeout' from source: unknown 30575 1726867665.91683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.91735: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867665.91741: variable 'omit' from source: magic vars 30575 1726867665.91744: starting attempt loop 30575 1726867665.91746: running the handler 30575 1726867665.91760: variable 'lsr_fail_debug' from source: play vars 30575 1726867665.91807: variable 'lsr_fail_debug' from source: play vars 30575 1726867665.91821: handler run complete 30575 1726867665.91831: attempt loop complete, returning result 30575 1726867665.91843: variable 'item' from source: unknown 30575 1726867665.91885: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30575 1726867665.91952: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.91958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.91965: variable 'omit' from source: magic vars 30575 1726867665.92063: variable 'ansible_distribution_major_version' from source: facts 30575 1726867665.92067: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867665.92071: variable 'omit' from source: magic vars 30575 1726867665.92083: variable 'omit' from source: magic vars 30575 1726867665.92113: variable 'item' from source: unknown 30575 1726867665.92153: variable 'item' from source: unknown 30575 1726867665.92164: variable 'omit' from source: magic vars 30575 1726867665.92178: Loading Connection 
'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867665.92186: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.92189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867665.92198: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867665.92200: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.92204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.92250: Set connection var ansible_pipelining to False 30575 1726867665.92253: Set connection var ansible_shell_type to sh 30575 1726867665.92255: Set connection var ansible_shell_executable to /bin/sh 30575 1726867665.92261: Set connection var ansible_timeout to 10 30575 1726867665.92265: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867665.92271: Set connection var ansible_connection to ssh 30575 1726867665.92287: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.92290: variable 'ansible_connection' from source: unknown 30575 1726867665.92292: variable 'ansible_module_compression' from source: unknown 30575 1726867665.92294: variable 'ansible_shell_type' from source: unknown 30575 1726867665.92296: variable 'ansible_shell_executable' from source: unknown 30575 1726867665.92298: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.92303: variable 'ansible_pipelining' from source: unknown 30575 1726867665.92305: variable 'ansible_timeout' from source: unknown 30575 1726867665.92309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.92366: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867665.92372: variable 'omit' from source: magic vars 30575 1726867665.92374: starting attempt loop 30575 1726867665.92378: running the handler 30575 1726867665.92395: variable 'lsr_cleanup' from source: include params 30575 1726867665.92443: variable 'lsr_cleanup' from source: include params 30575 1726867665.92455: handler run complete 30575 1726867665.92465: attempt loop complete, returning result 30575 1726867665.92476: variable 'item' from source: unknown 30575 1726867665.92517: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml", "tasks/check_network_dns.yml" ] } 30575 1726867665.92590: dumping result to json 30575 1726867665.92593: done dumping result, returning 30575 1726867665.92596: done running TaskExecutor() for managed_node3/TASK: Show item [0affcac9-a3a5-e081-a588-0000000020ae] 30575 1726867665.92598: sending task result for task 0affcac9-a3a5-e081-a588-0000000020ae 30575 1726867665.92637: done sending task result for task 0affcac9-a3a5-e081-a588-0000000020ae 30575 1726867665.92640: WORKER PROCESS EXITING 30575 1726867665.92691: no more pending results, returning what we have 30575 1726867665.92694: results queue empty 30575 1726867665.92695: checking for any_errors_fatal 30575 1726867665.92701: done checking for any_errors_fatal 30575 1726867665.92702: checking for max_fail_percentage 30575 1726867665.92704: done checking for max_fail_percentage 30575 1726867665.92705: checking to see if all hosts have failed and the running result is not ok 30575 1726867665.92705: done checking to see if all hosts have failed 30575 1726867665.92706: getting the 
remaining hosts for this loop 30575 1726867665.92708: done getting the remaining hosts for this loop 30575 1726867665.92712: getting the next task for host managed_node3 30575 1726867665.92719: done getting next task for host managed_node3 30575 1726867665.92722: ^ task is: TASK: Include the task 'show_interfaces.yml' 30575 1726867665.92724: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867665.92728: getting variables 30575 1726867665.92730: in VariableManager get_vars() 30575 1726867665.92769: Calling all_inventory to load vars for managed_node3 30575 1726867665.92771: Calling groups_inventory to load vars for managed_node3 30575 1726867665.92774: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867665.92785: Calling all_plugins_play to load vars for managed_node3 30575 1726867665.92788: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867665.92790: Calling groups_plugins_play to load vars for managed_node3 30575 1726867665.93727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867665.94575: done with get_vars() 30575 1726867665.94592: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 17:27:45 -0400 (0:00:00.076) 
0:01:41.324 ****** 30575 1726867665.94653: entering _queue_task() for managed_node3/include_tasks 30575 1726867665.94865: worker is 1 (out of 1 available) 30575 1726867665.94881: exiting _queue_task() for managed_node3/include_tasks 30575 1726867665.94894: done queuing things up, now waiting for results queue to drain 30575 1726867665.94896: waiting for pending results... 30575 1726867665.95069: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 30575 1726867665.95147: in run() - task 0affcac9-a3a5-e081-a588-0000000020af 30575 1726867665.95157: variable 'ansible_search_path' from source: unknown 30575 1726867665.95160: variable 'ansible_search_path' from source: unknown 30575 1726867665.95190: calling self._execute() 30575 1726867665.95257: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867665.95261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867665.95271: variable 'omit' from source: magic vars 30575 1726867665.95534: variable 'ansible_distribution_major_version' from source: facts 30575 1726867665.95542: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867665.95548: _execute() done 30575 1726867665.95554: dumping result to json 30575 1726867665.95557: done dumping result, returning 30575 1726867665.95568: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affcac9-a3a5-e081-a588-0000000020af] 30575 1726867665.95570: sending task result for task 0affcac9-a3a5-e081-a588-0000000020af 30575 1726867665.95646: done sending task result for task 0affcac9-a3a5-e081-a588-0000000020af 30575 1726867665.95649: WORKER PROCESS EXITING 30575 1726867665.95695: no more pending results, returning what we have 30575 1726867665.95700: in VariableManager get_vars() 30575 1726867665.95739: Calling all_inventory to load vars for managed_node3 30575 1726867665.95741: Calling groups_inventory to load vars for 
managed_node3 30575 1726867665.95744: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867665.95755: Calling all_plugins_play to load vars for managed_node3 30575 1726867665.95758: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867665.95760: Calling groups_plugins_play to load vars for managed_node3 30575 1726867665.96517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867665.97364: done with get_vars() 30575 1726867665.97380: variable 'ansible_search_path' from source: unknown 30575 1726867665.97381: variable 'ansible_search_path' from source: unknown 30575 1726867665.97405: we have included files to process 30575 1726867665.97406: generating all_blocks data 30575 1726867665.97407: done generating all_blocks data 30575 1726867665.97410: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30575 1726867665.97411: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30575 1726867665.97412: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30575 1726867665.97476: in VariableManager get_vars() 30575 1726867665.97490: done with get_vars() 30575 1726867665.97560: done processing included file 30575 1726867665.97561: iterating over new_blocks loaded from include file 30575 1726867665.97562: in VariableManager get_vars() 30575 1726867665.97572: done with get_vars() 30575 1726867665.97573: filtering new block on tags 30575 1726867665.97595: done filtering new block on tags 30575 1726867665.97597: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 30575 
1726867665.97600: extending task lists for all hosts with included blocks 30575 1726867665.97854: done extending task lists 30575 1726867665.97855: done processing included files 30575 1726867665.97856: results queue empty 30575 1726867665.97856: checking for any_errors_fatal 30575 1726867665.97859: done checking for any_errors_fatal 30575 1726867665.97860: checking for max_fail_percentage 30575 1726867665.97861: done checking for max_fail_percentage 30575 1726867665.97861: checking to see if all hosts have failed and the running result is not ok 30575 1726867665.97862: done checking to see if all hosts have failed 30575 1726867665.97862: getting the remaining hosts for this loop 30575 1726867665.97863: done getting the remaining hosts for this loop 30575 1726867665.97865: getting the next task for host managed_node3 30575 1726867665.97867: done getting next task for host managed_node3 30575 1726867665.97869: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30575 1726867665.97871: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867665.97873: getting variables 30575 1726867665.97873: in VariableManager get_vars() 30575 1726867665.97883: Calling all_inventory to load vars for managed_node3 30575 1726867665.97885: Calling groups_inventory to load vars for managed_node3 30575 1726867665.97886: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867665.97890: Calling all_plugins_play to load vars for managed_node3 30575 1726867665.97891: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867665.97893: Calling groups_plugins_play to load vars for managed_node3 30575 1726867665.98570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867665.99403: done with get_vars() 30575 1726867665.99417: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 17:27:45 -0400 (0:00:00.048) 0:01:41.372 ****** 30575 1726867665.99462: entering _queue_task() for managed_node3/include_tasks 30575 1726867665.99665: worker is 1 (out of 1 available) 30575 1726867665.99680: exiting _queue_task() for managed_node3/include_tasks 30575 1726867665.99693: done queuing things up, now waiting for results queue to drain 30575 1726867665.99695: waiting for pending results... 
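The "Show item" results earlier in the log (looping over `lsr_setup`, `lsr_test`, `lsr_assert`, `lsr_assert_when`, `lsr_fail_debug`, `lsr_cleanup`) and the include queued at `run_test.yml:21` follow the usual test-scaffolding pattern: a `debug` task looped over the phase-variable names, followed by `include_tasks`. A minimal sketch of what such tasks look like — the task names and file path are taken from the log, but the exact YAML in `run_test.yml` is an assumption:

```yaml
# Hypothetical reconstruction of the tasks driving the log output above;
# the real run_test.yml in the collection may differ.
- name: Show item
  debug:
    var: "{{ item }}"   # each item names a list variable, e.g. lsr_setup
  loop:
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug
    - lsr_cleanup

- name: Include the task 'show_interfaces.yml'
  include_tasks: tasks/show_interfaces.yml
```

With `var: "{{ item }}"`, each loop iteration prints the item name plus the contents of the variable it names, which matches the `ok: [managed_node3] => (item=lsr_setup)` result blocks in the log.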
30575 1726867665.99870: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 30575 1726867665.99945: in run() - task 0affcac9-a3a5-e081-a588-0000000020d6 30575 1726867665.99956: variable 'ansible_search_path' from source: unknown 30575 1726867665.99960: variable 'ansible_search_path' from source: unknown 30575 1726867665.99989: calling self._execute() 30575 1726867666.00057: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.00061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.00071: variable 'omit' from source: magic vars 30575 1726867666.00341: variable 'ansible_distribution_major_version' from source: facts 30575 1726867666.00351: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867666.00359: _execute() done 30575 1726867666.00362: dumping result to json 30575 1726867666.00366: done dumping result, returning 30575 1726867666.00373: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affcac9-a3a5-e081-a588-0000000020d6] 30575 1726867666.00379: sending task result for task 0affcac9-a3a5-e081-a588-0000000020d6 30575 1726867666.00459: done sending task result for task 0affcac9-a3a5-e081-a588-0000000020d6 30575 1726867666.00462: WORKER PROCESS EXITING 30575 1726867666.00494: no more pending results, returning what we have 30575 1726867666.00499: in VariableManager get_vars() 30575 1726867666.00548: Calling all_inventory to load vars for managed_node3 30575 1726867666.00551: Calling groups_inventory to load vars for managed_node3 30575 1726867666.00554: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867666.00565: Calling all_plugins_play to load vars for managed_node3 30575 1726867666.00568: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867666.00570: Calling groups_plugins_play to load vars for managed_node3 30575 
1726867666.01338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867666.02201: done with get_vars() 30575 1726867666.02214: variable 'ansible_search_path' from source: unknown 30575 1726867666.02215: variable 'ansible_search_path' from source: unknown 30575 1726867666.02240: we have included files to process 30575 1726867666.02241: generating all_blocks data 30575 1726867666.02242: done generating all_blocks data 30575 1726867666.02243: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867666.02243: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867666.02245: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30575 1726867666.02412: done processing included file 30575 1726867666.02413: iterating over new_blocks loaded from include file 30575 1726867666.02414: in VariableManager get_vars() 30575 1726867666.02427: done with get_vars() 30575 1726867666.02428: filtering new block on tags 30575 1726867666.02449: done filtering new block on tags 30575 1726867666.02450: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 30575 1726867666.02453: extending task lists for all hosts with included blocks 30575 1726867666.02546: done extending task lists 30575 1726867666.02547: done processing included files 30575 1726867666.02548: results queue empty 30575 1726867666.02548: checking for any_errors_fatal 30575 1726867666.02550: done checking for any_errors_fatal 30575 1726867666.02551: checking for max_fail_percentage 30575 1726867666.02551: done 
checking for max_fail_percentage 30575 1726867666.02552: checking to see if all hosts have failed and the running result is not ok 30575 1726867666.02553: done checking to see if all hosts have failed 30575 1726867666.02553: getting the remaining hosts for this loop 30575 1726867666.02554: done getting the remaining hosts for this loop 30575 1726867666.02555: getting the next task for host managed_node3 30575 1726867666.02558: done getting next task for host managed_node3 30575 1726867666.02560: ^ task is: TASK: Gather current interface info 30575 1726867666.02562: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867666.02564: getting variables 30575 1726867666.02564: in VariableManager get_vars() 30575 1726867666.02571: Calling all_inventory to load vars for managed_node3 30575 1726867666.02572: Calling groups_inventory to load vars for managed_node3 30575 1726867666.02574: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867666.02579: Calling all_plugins_play to load vars for managed_node3 30575 1726867666.02581: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867666.02582: Calling groups_plugins_play to load vars for managed_node3 30575 1726867666.06671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867666.07510: done with get_vars() 30575 1726867666.07527: done getting variables 30575 1726867666.07553: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 17:27:46 -0400 (0:00:00.081) 0:01:41.453 ****** 30575 1726867666.07571: entering _queue_task() for managed_node3/command 30575 1726867666.07829: worker is 1 (out of 1 available) 30575 1726867666.07842: exiting _queue_task() for managed_node3/command 30575 1726867666.07853: done queuing things up, now waiting for results queue to drain 30575 1726867666.07854: waiting for pending results... 
30575 1726867666.08035: running TaskExecutor() for managed_node3/TASK: Gather current interface info 30575 1726867666.08132: in run() - task 0affcac9-a3a5-e081-a588-000000002111 30575 1726867666.08143: variable 'ansible_search_path' from source: unknown 30575 1726867666.08147: variable 'ansible_search_path' from source: unknown 30575 1726867666.08175: calling self._execute() 30575 1726867666.08250: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.08254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.08262: variable 'omit' from source: magic vars 30575 1726867666.08538: variable 'ansible_distribution_major_version' from source: facts 30575 1726867666.08548: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867666.08554: variable 'omit' from source: magic vars 30575 1726867666.08588: variable 'omit' from source: magic vars 30575 1726867666.08612: variable 'omit' from source: magic vars 30575 1726867666.08648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867666.08686: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867666.08731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867666.08744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867666.08747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867666.08883: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867666.08887: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.08891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867666.08980: Set connection var ansible_pipelining to False 30575 1726867666.08984: Set connection var ansible_shell_type to sh 30575 1726867666.08991: Set connection var ansible_shell_executable to /bin/sh 30575 1726867666.08996: Set connection var ansible_timeout to 10 30575 1726867666.09002: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867666.09017: Set connection var ansible_connection to ssh 30575 1726867666.09034: variable 'ansible_shell_executable' from source: unknown 30575 1726867666.09037: variable 'ansible_connection' from source: unknown 30575 1726867666.09040: variable 'ansible_module_compression' from source: unknown 30575 1726867666.09043: variable 'ansible_shell_type' from source: unknown 30575 1726867666.09045: variable 'ansible_shell_executable' from source: unknown 30575 1726867666.09048: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.09050: variable 'ansible_pipelining' from source: unknown 30575 1726867666.09053: variable 'ansible_timeout' from source: unknown 30575 1726867666.09055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.09157: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867666.09167: variable 'omit' from source: magic vars 30575 1726867666.09173: starting attempt loop 30575 1726867666.09176: running the handler 30575 1726867666.09191: _low_level_execute_command(): starting 30575 1726867666.09197: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867666.09718: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30575 1726867666.09723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867666.09727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867666.09786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867666.09796: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867666.09799: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867666.09842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867666.11543: stdout chunk (state=3): >>>/root <<< 30575 1726867666.11773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867666.11776: stdout chunk (state=3): >>><<< 30575 1726867666.11781: stderr chunk (state=3): >>><<< 30575 1726867666.11784: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867666.11787: _low_level_execute_command(): starting 30575 1726867666.11789: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867666.1169436-35344-49994287507314 `" && echo ansible-tmp-1726867666.1169436-35344-49994287507314="` echo /root/.ansible/tmp/ansible-tmp-1726867666.1169436-35344-49994287507314 `" ) && sleep 0' 30575 1726867666.12376: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867666.12394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867666.12432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867666.12490: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867666.12542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867666.12625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867666.12676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867666.14556: stdout chunk (state=3): >>>ansible-tmp-1726867666.1169436-35344-49994287507314=/root/.ansible/tmp/ansible-tmp-1726867666.1169436-35344-49994287507314 <<< 30575 1726867666.14666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867666.14688: stderr chunk (state=3): >>><<< 30575 1726867666.14693: stdout chunk (state=3): >>><<< 30575 1726867666.14711: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867666.1169436-35344-49994287507314=/root/.ansible/tmp/ansible-tmp-1726867666.1169436-35344-49994287507314 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867666.14736: variable 'ansible_module_compression' from source: unknown 30575 1726867666.14776: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867666.14813: variable 'ansible_facts' from source: unknown 30575 1726867666.14864: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867666.1169436-35344-49994287507314/AnsiballZ_command.py 30575 1726867666.15195: Sending initial data 30575 1726867666.15198: Sent initial data (155 bytes) 30575 1726867666.15570: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867666.15581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867666.15595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867666.15696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867666.15710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867666.15724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867666.15793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867666.17310: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30575 1726867666.17323: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30575 1726867666.17334: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30575 1726867666.17350: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867666.17421: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867666.17468: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpb34wxkpy /root/.ansible/tmp/ansible-tmp-1726867666.1169436-35344-49994287507314/AnsiballZ_command.py <<< 30575 1726867666.17473: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867666.1169436-35344-49994287507314/AnsiballZ_command.py" <<< 30575 1726867666.17516: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpb34wxkpy" to remote "/root/.ansible/tmp/ansible-tmp-1726867666.1169436-35344-49994287507314/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867666.1169436-35344-49994287507314/AnsiballZ_command.py" <<< 30575 1726867666.18263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867666.18335: stderr chunk (state=3): >>><<< 30575 1726867666.18338: stdout chunk (state=3): >>><<< 30575 1726867666.18347: done transferring module to remote 30575 1726867666.18359: _low_level_execute_command(): starting 30575 1726867666.18368: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867666.1169436-35344-49994287507314/ /root/.ansible/tmp/ansible-tmp-1726867666.1169436-35344-49994287507314/AnsiballZ_command.py && sleep 0' 30575 1726867666.18960: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867666.18973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867666.18996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867666.19014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867666.19113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867666.19132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867666.19148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867666.19165: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867666.19238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867666.21047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867666.21051: stdout chunk (state=3): >>><<< 30575 1726867666.21057: stderr chunk (state=3): >>><<< 30575 1726867666.21074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867666.21085: _low_level_execute_command(): starting 30575 1726867666.21097: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867666.1169436-35344-49994287507314/AnsiballZ_command.py && sleep 0' 30575 1726867666.21681: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867666.21684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867666.21686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867666.21689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867666.21691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867666.21763: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867666.21813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867666.37319: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:27:46.367580", "end": "2024-09-20 17:27:46.371057", "delta": "0:00:00.003477", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867666.38809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867666.38832: stderr chunk (state=3): >>><<< 30575 1726867666.38836: stdout chunk (state=3): >>><<< 30575 1726867666.38855: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:27:46.367580", "end": "2024-09-20 17:27:46.371057", "delta": "0:00:00.003477", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867666.38889: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867666.1169436-35344-49994287507314/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867666.38895: _low_level_execute_command(): starting 30575 1726867666.38900: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867666.1169436-35344-49994287507314/ > /dev/null 2>&1 && sleep 0' 30575 1726867666.39318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867666.39322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867666.39324: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867666.39330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867666.39381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867666.39385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867666.39442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867666.41341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867666.41369: stderr chunk (state=3): >>><<< 30575 1726867666.41372: stdout chunk (state=3): >>><<< 30575 1726867666.41583: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867666.41587: handler run complete 30575 1726867666.41589: Evaluated conditional (False): False 30575 1726867666.41592: attempt loop complete, returning result 30575 1726867666.41593: _execute() done 30575 1726867666.41595: dumping result to json 30575 1726867666.41597: done dumping result, returning 30575 1726867666.41599: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affcac9-a3a5-e081-a588-000000002111] 30575 1726867666.41601: sending task result for task 0affcac9-a3a5-e081-a588-000000002111 30575 1726867666.41675: done sending task result for task 0affcac9-a3a5-e081-a588-000000002111 30575 1726867666.41685: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003477", "end": "2024-09-20 17:27:46.371057", "rc": 0, "start": "2024-09-20 17:27:46.367580" } STDOUT: bonding_masters eth0 lo 30575 1726867666.41768: no more pending results, returning what we have 30575 1726867666.41772: results queue empty 30575 1726867666.41773: checking for any_errors_fatal 30575 1726867666.41775: done checking for 
any_errors_fatal 30575 1726867666.41775: checking for max_fail_percentage 30575 1726867666.41779: done checking for max_fail_percentage 30575 1726867666.41780: checking to see if all hosts have failed and the running result is not ok 30575 1726867666.41781: done checking to see if all hosts have failed 30575 1726867666.41782: getting the remaining hosts for this loop 30575 1726867666.41784: done getting the remaining hosts for this loop 30575 1726867666.41789: getting the next task for host managed_node3 30575 1726867666.41803: done getting next task for host managed_node3 30575 1726867666.41806: ^ task is: TASK: Set current_interfaces 30575 1726867666.41812: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867666.41820: getting variables 30575 1726867666.41822: in VariableManager get_vars() 30575 1726867666.41869: Calling all_inventory to load vars for managed_node3 30575 1726867666.41872: Calling groups_inventory to load vars for managed_node3 30575 1726867666.41875: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867666.42004: Calling all_plugins_play to load vars for managed_node3 30575 1726867666.42008: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867666.42014: Calling groups_plugins_play to load vars for managed_node3 30575 1726867666.43729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867666.45648: done with get_vars() 30575 1726867666.45670: done getting variables 30575 1726867666.45742: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 17:27:46 -0400 (0:00:00.381) 0:01:41.835 ****** 30575 1726867666.45774: entering _queue_task() for managed_node3/set_fact 30575 1726867666.46298: worker is 1 (out of 1 available) 30575 1726867666.46308: exiting _queue_task() for managed_node3/set_fact 30575 1726867666.46321: done queuing things up, now waiting for results queue to drain 30575 1726867666.46323: waiting for pending results... 
30575 1726867666.46565: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 30575 1726867666.46645: in run() - task 0affcac9-a3a5-e081-a588-000000002112 30575 1726867666.46676: variable 'ansible_search_path' from source: unknown 30575 1726867666.46687: variable 'ansible_search_path' from source: unknown 30575 1726867666.46731: calling self._execute() 30575 1726867666.46844: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.46882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.46886: variable 'omit' from source: magic vars 30575 1726867666.47289: variable 'ansible_distribution_major_version' from source: facts 30575 1726867666.47385: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867666.47390: variable 'omit' from source: magic vars 30575 1726867666.47393: variable 'omit' from source: magic vars 30575 1726867666.47518: variable '_current_interfaces' from source: set_fact 30575 1726867666.47598: variable 'omit' from source: magic vars 30575 1726867666.47655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867666.47698: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867666.47727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867666.47760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867666.47780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867666.47818: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867666.47829: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.47862: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.47952: Set connection var ansible_pipelining to False 30575 1726867666.47968: Set connection var ansible_shell_type to sh 30575 1726867666.47989: Set connection var ansible_shell_executable to /bin/sh 30575 1726867666.48000: Set connection var ansible_timeout to 10 30575 1726867666.48079: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867666.48083: Set connection var ansible_connection to ssh 30575 1726867666.48085: variable 'ansible_shell_executable' from source: unknown 30575 1726867666.48088: variable 'ansible_connection' from source: unknown 30575 1726867666.48091: variable 'ansible_module_compression' from source: unknown 30575 1726867666.48093: variable 'ansible_shell_type' from source: unknown 30575 1726867666.48095: variable 'ansible_shell_executable' from source: unknown 30575 1726867666.48096: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.48098: variable 'ansible_pipelining' from source: unknown 30575 1726867666.48100: variable 'ansible_timeout' from source: unknown 30575 1726867666.48102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.48246: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867666.48282: variable 'omit' from source: magic vars 30575 1726867666.48285: starting attempt loop 30575 1726867666.48296: running the handler 30575 1726867666.48324: handler run complete 30575 1726867666.48328: attempt loop complete, returning result 30575 1726867666.48330: _execute() done 30575 1726867666.48332: dumping result to json 30575 1726867666.48338: done dumping result, returning 30575 
1726867666.48348: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affcac9-a3a5-e081-a588-000000002112] 30575 1726867666.48356: sending task result for task 0affcac9-a3a5-e081-a588-000000002112 30575 1726867666.48627: done sending task result for task 0affcac9-a3a5-e081-a588-000000002112 30575 1726867666.48630: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 30575 1726867666.48683: no more pending results, returning what we have 30575 1726867666.48686: results queue empty 30575 1726867666.48687: checking for any_errors_fatal 30575 1726867666.48694: done checking for any_errors_fatal 30575 1726867666.48695: checking for max_fail_percentage 30575 1726867666.48697: done checking for max_fail_percentage 30575 1726867666.48698: checking to see if all hosts have failed and the running result is not ok 30575 1726867666.48698: done checking to see if all hosts have failed 30575 1726867666.48699: getting the remaining hosts for this loop 30575 1726867666.48701: done getting the remaining hosts for this loop 30575 1726867666.48704: getting the next task for host managed_node3 30575 1726867666.48713: done getting next task for host managed_node3 30575 1726867666.48718: ^ task is: TASK: Show current_interfaces 30575 1726867666.48722: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867666.48725: getting variables 30575 1726867666.48726: in VariableManager get_vars() 30575 1726867666.48763: Calling all_inventory to load vars for managed_node3 30575 1726867666.48766: Calling groups_inventory to load vars for managed_node3 30575 1726867666.48769: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867666.48781: Calling all_plugins_play to load vars for managed_node3 30575 1726867666.48784: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867666.48787: Calling groups_plugins_play to load vars for managed_node3 30575 1726867666.50213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867666.51660: done with get_vars() 30575 1726867666.51676: done getting variables 30575 1726867666.51720: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 17:27:46 -0400 (0:00:00.059) 0:01:41.895 ****** 30575 1726867666.51745: entering _queue_task() for managed_node3/debug 30575 1726867666.51984: worker is 1 (out of 1 available) 30575 1726867666.51998: exiting _queue_task() for managed_node3/debug 30575 1726867666.52010: done queuing things up, now waiting for results queue to drain 30575 1726867666.52012: waiting for pending results... 
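The "Show current_interfaces" task queued above (`show_interfaces.yml:5`) is a `debug` action. Judging by the rendered `MSG` that appears in its result a few entries later, it likely reads:

```yaml
# Plausible sketch of the "Show current_interfaces" task;
# the msg format is inferred from the log output, not confirmed.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```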
30575 1726867666.52205: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 30575 1726867666.52295: in run() - task 0affcac9-a3a5-e081-a588-0000000020d7 30575 1726867666.52308: variable 'ansible_search_path' from source: unknown 30575 1726867666.52311: variable 'ansible_search_path' from source: unknown 30575 1726867666.52342: calling self._execute() 30575 1726867666.52427: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.52430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.52440: variable 'omit' from source: magic vars 30575 1726867666.52727: variable 'ansible_distribution_major_version' from source: facts 30575 1726867666.52736: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867666.52743: variable 'omit' from source: magic vars 30575 1726867666.52773: variable 'omit' from source: magic vars 30575 1726867666.52846: variable 'current_interfaces' from source: set_fact 30575 1726867666.52869: variable 'omit' from source: magic vars 30575 1726867666.52904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867666.52931: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867666.52950: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867666.52963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867666.52973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867666.53002: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867666.53005: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.53008: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.53079: Set connection var ansible_pipelining to False 30575 1726867666.53083: Set connection var ansible_shell_type to sh 30575 1726867666.53093: Set connection var ansible_shell_executable to /bin/sh 30575 1726867666.53095: Set connection var ansible_timeout to 10 30575 1726867666.53098: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867666.53144: Set connection var ansible_connection to ssh 30575 1726867666.53147: variable 'ansible_shell_executable' from source: unknown 30575 1726867666.53150: variable 'ansible_connection' from source: unknown 30575 1726867666.53152: variable 'ansible_module_compression' from source: unknown 30575 1726867666.53154: variable 'ansible_shell_type' from source: unknown 30575 1726867666.53157: variable 'ansible_shell_executable' from source: unknown 30575 1726867666.53159: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.53165: variable 'ansible_pipelining' from source: unknown 30575 1726867666.53167: variable 'ansible_timeout' from source: unknown 30575 1726867666.53169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.53335: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867666.53349: variable 'omit' from source: magic vars 30575 1726867666.53584: starting attempt loop 30575 1726867666.53587: running the handler 30575 1726867666.53589: handler run complete 30575 1726867666.53591: attempt loop complete, returning result 30575 1726867666.53593: _execute() done 30575 1726867666.53594: dumping result to json 30575 1726867666.53596: done dumping result, returning 30575 1726867666.53598: done 
running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affcac9-a3a5-e081-a588-0000000020d7] 30575 1726867666.53600: sending task result for task 0affcac9-a3a5-e081-a588-0000000020d7 30575 1726867666.53669: done sending task result for task 0affcac9-a3a5-e081-a588-0000000020d7 30575 1726867666.53672: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 30575 1726867666.53718: no more pending results, returning what we have 30575 1726867666.53722: results queue empty 30575 1726867666.53723: checking for any_errors_fatal 30575 1726867666.53728: done checking for any_errors_fatal 30575 1726867666.53729: checking for max_fail_percentage 30575 1726867666.53731: done checking for max_fail_percentage 30575 1726867666.53732: checking to see if all hosts have failed and the running result is not ok 30575 1726867666.53733: done checking to see if all hosts have failed 30575 1726867666.53734: getting the remaining hosts for this loop 30575 1726867666.53735: done getting the remaining hosts for this loop 30575 1726867666.53739: getting the next task for host managed_node3 30575 1726867666.53749: done getting next task for host managed_node3 30575 1726867666.53753: ^ task is: TASK: Setup 30575 1726867666.53756: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867666.53761: getting variables 30575 1726867666.53763: in VariableManager get_vars() 30575 1726867666.53814: Calling all_inventory to load vars for managed_node3 30575 1726867666.53817: Calling groups_inventory to load vars for managed_node3 30575 1726867666.53821: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867666.53832: Calling all_plugins_play to load vars for managed_node3 30575 1726867666.53835: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867666.53838: Calling groups_plugins_play to load vars for managed_node3 30575 1726867666.55236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867666.56082: done with get_vars() 30575 1726867666.56099: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 17:27:46 -0400 (0:00:00.044) 0:01:41.939 ****** 30575 1726867666.56160: entering _queue_task() for managed_node3/include_tasks 30575 1726867666.56376: worker is 1 (out of 1 available) 30575 1726867666.56391: exiting _queue_task() for managed_node3/include_tasks 30575 1726867666.56402: done queuing things up, now waiting for results queue to drain 30575 1726867666.56404: waiting for pending results... 
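The "Setup" task (`run_test.yml:24`) is an `include_tasks` that iterates over `lsr_setup`, which the log identifies as an include parameter; the three rounds of `item` evaluation that follow correspond to the three files later reported as included. A hedged sketch of the likely shape:

```yaml
# Hypothetical sketch of the "Setup" include loop; lsr_setup is named
# in the log, but the loop form itself is an assumption.
- name: Setup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_setup }}"
```

Here `lsr_setup` would hold the per-test setup files, matching the `(item=tasks/...)` entries the log records for each included file.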
30575 1726867666.56579: running TaskExecutor() for managed_node3/TASK: Setup 30575 1726867666.56655: in run() - task 0affcac9-a3a5-e081-a588-0000000020b0 30575 1726867666.56667: variable 'ansible_search_path' from source: unknown 30575 1726867666.56670: variable 'ansible_search_path' from source: unknown 30575 1726867666.56712: variable 'lsr_setup' from source: include params 30575 1726867666.56870: variable 'lsr_setup' from source: include params 30575 1726867666.56982: variable 'omit' from source: magic vars 30575 1726867666.57075: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.57095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.57115: variable 'omit' from source: magic vars 30575 1726867666.57359: variable 'ansible_distribution_major_version' from source: facts 30575 1726867666.57372: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867666.57388: variable 'item' from source: unknown 30575 1726867666.57452: variable 'item' from source: unknown 30575 1726867666.57489: variable 'item' from source: unknown 30575 1726867666.57782: variable 'item' from source: unknown 30575 1726867666.57888: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.57891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.57894: variable 'omit' from source: magic vars 30575 1726867666.57902: variable 'ansible_distribution_major_version' from source: facts 30575 1726867666.57918: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867666.57921: variable 'item' from source: unknown 30575 1726867666.57985: variable 'item' from source: unknown 30575 1726867666.58004: variable 'item' from source: unknown 30575 1726867666.58052: variable 'item' from source: unknown 30575 1726867666.58113: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867666.58120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.58126: variable 'omit' from source: magic vars 30575 1726867666.58232: variable 'ansible_distribution_major_version' from source: facts 30575 1726867666.58235: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867666.58238: variable 'item' from source: unknown 30575 1726867666.58284: variable 'item' from source: unknown 30575 1726867666.58303: variable 'item' from source: unknown 30575 1726867666.58347: variable 'item' from source: unknown 30575 1726867666.58404: dumping result to json 30575 1726867666.58407: done dumping result, returning 30575 1726867666.58409: done running TaskExecutor() for managed_node3/TASK: Setup [0affcac9-a3a5-e081-a588-0000000020b0] 30575 1726867666.58412: sending task result for task 0affcac9-a3a5-e081-a588-0000000020b0 30575 1726867666.58447: done sending task result for task 0affcac9-a3a5-e081-a588-0000000020b0 30575 1726867666.58449: WORKER PROCESS EXITING 30575 1726867666.58484: no more pending results, returning what we have 30575 1726867666.58489: in VariableManager get_vars() 30575 1726867666.58539: Calling all_inventory to load vars for managed_node3 30575 1726867666.58541: Calling groups_inventory to load vars for managed_node3 30575 1726867666.58545: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867666.58555: Calling all_plugins_play to load vars for managed_node3 30575 1726867666.58560: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867666.58563: Calling groups_plugins_play to load vars for managed_node3 30575 1726867666.59345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867666.60211: done with get_vars() 30575 1726867666.60226: variable 'ansible_search_path' from source: unknown 30575 1726867666.60227: variable 'ansible_search_path' from source: unknown 30575 
1726867666.60252: variable 'ansible_search_path' from source: unknown 30575 1726867666.60253: variable 'ansible_search_path' from source: unknown 30575 1726867666.60269: variable 'ansible_search_path' from source: unknown 30575 1726867666.60270: variable 'ansible_search_path' from source: unknown 30575 1726867666.60289: we have included files to process 30575 1726867666.60290: generating all_blocks data 30575 1726867666.60292: done generating all_blocks data 30575 1726867666.60295: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30575 1726867666.60296: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30575 1726867666.60297: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30575 1726867666.60447: done processing included file 30575 1726867666.60448: iterating over new_blocks loaded from include file 30575 1726867666.60449: in VariableManager get_vars() 30575 1726867666.60458: done with get_vars() 30575 1726867666.60459: filtering new block on tags 30575 1726867666.60484: done filtering new block on tags 30575 1726867666.60486: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node3 => (item=tasks/create_bridge_profile.yml) 30575 1726867666.60488: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30575 1726867666.60489: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30575 1726867666.60491: Loading data from 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30575 1726867666.60549: done processing included file 30575 1726867666.60551: iterating over new_blocks loaded from include file 30575 1726867666.60551: in VariableManager get_vars() 30575 1726867666.60561: done with get_vars() 30575 1726867666.60562: filtering new block on tags 30575 1726867666.60576: done filtering new block on tags 30575 1726867666.60579: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node3 => (item=tasks/activate_profile.yml) 30575 1726867666.60581: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30575 1726867666.60582: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30575 1726867666.60583: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30575 1726867666.60641: done processing included file 30575 1726867666.60642: iterating over new_blocks loaded from include file 30575 1726867666.60643: in VariableManager get_vars() 30575 1726867666.60653: done with get_vars() 30575 1726867666.60654: filtering new block on tags 30575 1726867666.60666: done filtering new block on tags 30575 1726867666.60668: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node3 => (item=tasks/remove+down_profile.yml) 30575 1726867666.60670: extending task lists for all hosts with included blocks 30575 1726867666.61058: done extending task lists 30575 1726867666.61059: done processing 
included files 30575 1726867666.61059: results queue empty 30575 1726867666.61060: checking for any_errors_fatal 30575 1726867666.61062: done checking for any_errors_fatal 30575 1726867666.61063: checking for max_fail_percentage 30575 1726867666.61064: done checking for max_fail_percentage 30575 1726867666.61064: checking to see if all hosts have failed and the running result is not ok 30575 1726867666.61065: done checking to see if all hosts have failed 30575 1726867666.61065: getting the remaining hosts for this loop 30575 1726867666.61066: done getting the remaining hosts for this loop 30575 1726867666.61068: getting the next task for host managed_node3 30575 1726867666.61070: done getting next task for host managed_node3 30575 1726867666.61071: ^ task is: TASK: Include network role 30575 1726867666.61073: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867666.61075: getting variables 30575 1726867666.61076: in VariableManager get_vars() 30575 1726867666.61084: Calling all_inventory to load vars for managed_node3 30575 1726867666.61085: Calling groups_inventory to load vars for managed_node3 30575 1726867666.61086: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867666.61090: Calling all_plugins_play to load vars for managed_node3 30575 1726867666.61091: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867666.61093: Calling groups_plugins_play to load vars for managed_node3 30575 1726867666.61747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867666.62590: done with get_vars() 30575 1726867666.62604: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 17:27:46 -0400 (0:00:00.064) 0:01:42.004 ****** 30575 1726867666.62651: entering _queue_task() for managed_node3/include_role 30575 1726867666.62851: worker is 1 (out of 1 available) 30575 1726867666.62865: exiting _queue_task() for managed_node3/include_role 30575 1726867666.62880: done queuing things up, now waiting for results queue to drain 30575 1726867666.62882: waiting for pending results... 
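The "Include network role" task (`create_bridge_profile.yml:3`) dispatches an `include_role`. Based on the role the log subsequently reports as included, it presumably reads:

```yaml
# Plausible sketch; the role name is taken from the later
# "included: fedora.linux_system_roles.network" log entry.
- name: Include network role
  include_role:
    name: fedora.linux_system_roles.network
```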
30575 1726867666.63063: running TaskExecutor() for managed_node3/TASK: Include network role 30575 1726867666.63143: in run() - task 0affcac9-a3a5-e081-a588-000000002139 30575 1726867666.63155: variable 'ansible_search_path' from source: unknown 30575 1726867666.63159: variable 'ansible_search_path' from source: unknown 30575 1726867666.63188: calling self._execute() 30575 1726867666.63263: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.63266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.63276: variable 'omit' from source: magic vars 30575 1726867666.63554: variable 'ansible_distribution_major_version' from source: facts 30575 1726867666.63563: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867666.63569: _execute() done 30575 1726867666.63572: dumping result to json 30575 1726867666.63578: done dumping result, returning 30575 1726867666.63584: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcac9-a3a5-e081-a588-000000002139] 30575 1726867666.63589: sending task result for task 0affcac9-a3a5-e081-a588-000000002139 30575 1726867666.63689: done sending task result for task 0affcac9-a3a5-e081-a588-000000002139 30575 1726867666.63692: WORKER PROCESS EXITING 30575 1726867666.63719: no more pending results, returning what we have 30575 1726867666.63724: in VariableManager get_vars() 30575 1726867666.63770: Calling all_inventory to load vars for managed_node3 30575 1726867666.63772: Calling groups_inventory to load vars for managed_node3 30575 1726867666.63775: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867666.63791: Calling all_plugins_play to load vars for managed_node3 30575 1726867666.63794: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867666.63797: Calling groups_plugins_play to load vars for managed_node3 30575 1726867666.64541: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867666.65495: done with get_vars() 30575 1726867666.65510: variable 'ansible_search_path' from source: unknown 30575 1726867666.65511: variable 'ansible_search_path' from source: unknown 30575 1726867666.65619: variable 'omit' from source: magic vars 30575 1726867666.65642: variable 'omit' from source: magic vars 30575 1726867666.65651: variable 'omit' from source: magic vars 30575 1726867666.65653: we have included files to process 30575 1726867666.65654: generating all_blocks data 30575 1726867666.65655: done generating all_blocks data 30575 1726867666.65656: processing included file: fedora.linux_system_roles.network 30575 1726867666.65668: in VariableManager get_vars() 30575 1726867666.65676: done with get_vars() 30575 1726867666.65695: in VariableManager get_vars() 30575 1726867666.65707: done with get_vars() 30575 1726867666.65735: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30575 1726867666.65802: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30575 1726867666.65852: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30575 1726867666.66108: in VariableManager get_vars() 30575 1726867666.66123: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867666.67317: iterating over new_blocks loaded from include file 30575 1726867666.67319: in VariableManager get_vars() 30575 1726867666.67332: done with get_vars() 30575 1726867666.67333: filtering new block on tags 30575 1726867666.67494: done filtering new block on tags 30575 1726867666.67497: in VariableManager get_vars() 30575 1726867666.67507: done with get_vars() 30575 1726867666.67508: filtering new block on tags 30575 1726867666.67519: done 
filtering new block on tags 30575 1726867666.67520: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30575 1726867666.67523: extending task lists for all hosts with included blocks 30575 1726867666.67617: done extending task lists 30575 1726867666.67618: done processing included files 30575 1726867666.67618: results queue empty 30575 1726867666.67619: checking for any_errors_fatal 30575 1726867666.67621: done checking for any_errors_fatal 30575 1726867666.67622: checking for max_fail_percentage 30575 1726867666.67622: done checking for max_fail_percentage 30575 1726867666.67623: checking to see if all hosts have failed and the running result is not ok 30575 1726867666.67624: done checking to see if all hosts have failed 30575 1726867666.67624: getting the remaining hosts for this loop 30575 1726867666.67625: done getting the remaining hosts for this loop 30575 1726867666.67627: getting the next task for host managed_node3 30575 1726867666.67630: done getting next task for host managed_node3 30575 1726867666.67631: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867666.67633: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867666.67640: getting variables 30575 1726867666.67641: in VariableManager get_vars() 30575 1726867666.67650: Calling all_inventory to load vars for managed_node3 30575 1726867666.67651: Calling groups_inventory to load vars for managed_node3 30575 1726867666.67653: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867666.67656: Calling all_plugins_play to load vars for managed_node3 30575 1726867666.67658: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867666.67661: Calling groups_plugins_play to load vars for managed_node3 30575 1726867666.68349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867666.69307: done with get_vars() 30575 1726867666.69324: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:27:46 -0400 (0:00:00.067) 0:01:42.071 ****** 30575 1726867666.69380: entering _queue_task() for managed_node3/include_tasks 30575 1726867666.69663: worker is 1 (out of 1 available) 30575 1726867666.69676: exiting _queue_task() for managed_node3/include_tasks 30575 1726867666.69690: done queuing things up, now waiting for results queue to drain 30575 1726867666.69692: waiting for pending results... 
30575 1726867666.69896: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867666.69992: in run() - task 0affcac9-a3a5-e081-a588-0000000021a3 30575 1726867666.70004: variable 'ansible_search_path' from source: unknown 30575 1726867666.70009: variable 'ansible_search_path' from source: unknown 30575 1726867666.70045: calling self._execute() 30575 1726867666.70122: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.70128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.70184: variable 'omit' from source: magic vars 30575 1726867666.70428: variable 'ansible_distribution_major_version' from source: facts 30575 1726867666.70438: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867666.70443: _execute() done 30575 1726867666.70446: dumping result to json 30575 1726867666.70452: done dumping result, returning 30575 1726867666.70457: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-e081-a588-0000000021a3] 30575 1726867666.70466: sending task result for task 0affcac9-a3a5-e081-a588-0000000021a3 30575 1726867666.70548: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021a3 30575 1726867666.70551: WORKER PROCESS EXITING 30575 1726867666.70611: no more pending results, returning what we have 30575 1726867666.70616: in VariableManager get_vars() 30575 1726867666.70673: Calling all_inventory to load vars for managed_node3 30575 1726867666.70675: Calling groups_inventory to load vars for managed_node3 30575 1726867666.70679: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867666.70690: Calling all_plugins_play to load vars for managed_node3 30575 1726867666.70693: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867666.70695: Calling 
groups_plugins_play to load vars for managed_node3 30575 1726867666.71513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867666.72379: done with get_vars() 30575 1726867666.72394: variable 'ansible_search_path' from source: unknown 30575 1726867666.72395: variable 'ansible_search_path' from source: unknown 30575 1726867666.72424: we have included files to process 30575 1726867666.72425: generating all_blocks data 30575 1726867666.72426: done generating all_blocks data 30575 1726867666.72429: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867666.72429: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867666.72431: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867666.72803: done processing included file 30575 1726867666.72805: iterating over new_blocks loaded from include file 30575 1726867666.72806: in VariableManager get_vars() 30575 1726867666.72823: done with get_vars() 30575 1726867666.72825: filtering new block on tags 30575 1726867666.72845: done filtering new block on tags 30575 1726867666.72847: in VariableManager get_vars() 30575 1726867666.72862: done with get_vars() 30575 1726867666.72863: filtering new block on tags 30575 1726867666.72891: done filtering new block on tags 30575 1726867666.72893: in VariableManager get_vars() 30575 1726867666.72906: done with get_vars() 30575 1726867666.72907: filtering new block on tags 30575 1726867666.72930: done filtering new block on tags 30575 1726867666.72932: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30575 1726867666.72936: extending task lists for 
all hosts with included blocks 30575 1726867666.73880: done extending task lists 30575 1726867666.73881: done processing included files 30575 1726867666.73882: results queue empty 30575 1726867666.73882: checking for any_errors_fatal 30575 1726867666.73885: done checking for any_errors_fatal 30575 1726867666.73885: checking for max_fail_percentage 30575 1726867666.73886: done checking for max_fail_percentage 30575 1726867666.73887: checking to see if all hosts have failed and the running result is not ok 30575 1726867666.73887: done checking to see if all hosts have failed 30575 1726867666.73888: getting the remaining hosts for this loop 30575 1726867666.73889: done getting the remaining hosts for this loop 30575 1726867666.73891: getting the next task for host managed_node3 30575 1726867666.73894: done getting next task for host managed_node3 30575 1726867666.73896: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867666.73898: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867666.73905: getting variables 30575 1726867666.73906: in VariableManager get_vars() 30575 1726867666.73922: Calling all_inventory to load vars for managed_node3 30575 1726867666.73925: Calling groups_inventory to load vars for managed_node3 30575 1726867666.73927: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867666.73932: Calling all_plugins_play to load vars for managed_node3 30575 1726867666.73934: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867666.73940: Calling groups_plugins_play to load vars for managed_node3 30575 1726867666.74608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867666.75496: done with get_vars() 30575 1726867666.75516: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:27:46 -0400 (0:00:00.062) 0:01:42.133 ****** 30575 1726867666.75585: entering _queue_task() for managed_node3/setup 30575 1726867666.75949: worker is 1 (out of 1 available) 30575 1726867666.75963: exiting _queue_task() for managed_node3/setup 30575 1726867666.75975: done queuing things up, now waiting for results queue to drain 30575 1726867666.75979: waiting for pending results... 
30575 1726867666.76307: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867666.76439: in run() - task 0affcac9-a3a5-e081-a588-000000002200 30575 1726867666.76462: variable 'ansible_search_path' from source: unknown 30575 1726867666.76511: variable 'ansible_search_path' from source: unknown 30575 1726867666.76518: calling self._execute() 30575 1726867666.76618: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.76623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.76636: variable 'omit' from source: magic vars 30575 1726867666.76928: variable 'ansible_distribution_major_version' from source: facts 30575 1726867666.76938: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867666.77087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867666.78783: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867666.78786: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867666.78813: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867666.78853: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867666.78886: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867666.78961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867666.78995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867666.79032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867666.79079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867666.79099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867666.79160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867666.79190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867666.79215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867666.79260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867666.79279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867666.79419: variable '__network_required_facts' from source: role 
'' defaults 30575 1726867666.79455: variable 'ansible_facts' from source: unknown 30575 1726867666.80198: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30575 1726867666.80207: when evaluation is False, skipping this task 30575 1726867666.80222: _execute() done 30575 1726867666.80326: dumping result to json 30575 1726867666.80330: done dumping result, returning 30575 1726867666.80333: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-e081-a588-000000002200] 30575 1726867666.80335: sending task result for task 0affcac9-a3a5-e081-a588-000000002200 30575 1726867666.80401: done sending task result for task 0affcac9-a3a5-e081-a588-000000002200 30575 1726867666.80405: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867666.80476: no more pending results, returning what we have 30575 1726867666.80481: results queue empty 30575 1726867666.80482: checking for any_errors_fatal 30575 1726867666.80484: done checking for any_errors_fatal 30575 1726867666.80485: checking for max_fail_percentage 30575 1726867666.80487: done checking for max_fail_percentage 30575 1726867666.80488: checking to see if all hosts have failed and the running result is not ok 30575 1726867666.80489: done checking to see if all hosts have failed 30575 1726867666.80489: getting the remaining hosts for this loop 30575 1726867666.80491: done getting the remaining hosts for this loop 30575 1726867666.80495: getting the next task for host managed_node3 30575 1726867666.80508: done getting next task for host managed_node3 30575 1726867666.80513: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867666.80519: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867666.80588: getting variables 30575 1726867666.80590: in VariableManager get_vars() 30575 1726867666.80638: Calling all_inventory to load vars for managed_node3 30575 1726867666.80640: Calling groups_inventory to load vars for managed_node3 30575 1726867666.80643: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867666.80766: Calling all_plugins_play to load vars for managed_node3 30575 1726867666.80770: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867666.80780: Calling groups_plugins_play to load vars for managed_node3 30575 1726867666.82216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867666.83813: done with get_vars() 30575 1726867666.83836: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:27:46 -0400 (0:00:00.083) 0:01:42.216 ****** 30575 1726867666.83940: entering _queue_task() for managed_node3/stat 30575 1726867666.84271: worker is 1 (out of 1 available) 30575 1726867666.84289: exiting _queue_task() for managed_node3/stat 30575 1726867666.84304: done queuing things up, now waiting for results queue to drain 30575 1726867666.84306: waiting for pending results... 
30575 1726867666.84575: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867666.84760: in run() - task 0affcac9-a3a5-e081-a588-000000002202 30575 1726867666.84785: variable 'ansible_search_path' from source: unknown 30575 1726867666.84794: variable 'ansible_search_path' from source: unknown 30575 1726867666.84842: calling self._execute() 30575 1726867666.84946: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.84958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.84973: variable 'omit' from source: magic vars 30575 1726867666.85374: variable 'ansible_distribution_major_version' from source: facts 30575 1726867666.85395: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867666.85572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867666.85912: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867666.85963: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867666.86009: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867666.86049: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867666.86193: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867666.86196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867666.86203: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867666.86238: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867666.86334: variable '__network_is_ostree' from source: set_fact 30575 1726867666.86346: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867666.86407: when evaluation is False, skipping this task 30575 1726867666.86411: _execute() done 30575 1726867666.86414: dumping result to json 30575 1726867666.86418: done dumping result, returning 30575 1726867666.86421: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-e081-a588-000000002202] 30575 1726867666.86429: sending task result for task 0affcac9-a3a5-e081-a588-000000002202 30575 1726867666.86502: done sending task result for task 0affcac9-a3a5-e081-a588-000000002202 30575 1726867666.86505: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867666.86583: no more pending results, returning what we have 30575 1726867666.86586: results queue empty 30575 1726867666.86587: checking for any_errors_fatal 30575 1726867666.86600: done checking for any_errors_fatal 30575 1726867666.86601: checking for max_fail_percentage 30575 1726867666.86602: done checking for max_fail_percentage 30575 1726867666.86604: checking to see if all hosts have failed and the running result is not ok 30575 1726867666.86605: done checking to see if all hosts have failed 30575 1726867666.86606: getting the remaining hosts for this loop 30575 1726867666.86607: done getting the remaining hosts for this loop 30575 
1726867666.86611: getting the next task for host managed_node3 30575 1726867666.86621: done getting next task for host managed_node3 30575 1726867666.86626: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867666.86631: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867666.86650: getting variables 30575 1726867666.86651: in VariableManager get_vars() 30575 1726867666.86691: Calling all_inventory to load vars for managed_node3 30575 1726867666.86694: Calling groups_inventory to load vars for managed_node3 30575 1726867666.86696: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867666.86705: Calling all_plugins_play to load vars for managed_node3 30575 1726867666.86707: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867666.86710: Calling groups_plugins_play to load vars for managed_node3 30575 1726867666.87607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867666.88705: done with get_vars() 30575 1726867666.88725: done getting variables 30575 1726867666.88779: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:27:46 -0400 (0:00:00.048) 0:01:42.265 ****** 30575 1726867666.88813: entering _queue_task() for managed_node3/set_fact 30575 1726867666.89091: worker is 1 (out of 1 available) 30575 1726867666.89103: exiting _queue_task() for managed_node3/set_fact 30575 1726867666.89118: done queuing things up, now waiting for results queue to drain 30575 1726867666.89119: waiting for pending results... 
30575 1726867666.89406: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867666.89507: in run() - task 0affcac9-a3a5-e081-a588-000000002203 30575 1726867666.89521: variable 'ansible_search_path' from source: unknown 30575 1726867666.89525: variable 'ansible_search_path' from source: unknown 30575 1726867666.89551: calling self._execute() 30575 1726867666.89625: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.89629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.89639: variable 'omit' from source: magic vars 30575 1726867666.89903: variable 'ansible_distribution_major_version' from source: facts 30575 1726867666.89912: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867666.90026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867666.90226: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867666.90252: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867666.90276: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867666.90303: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867666.90366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867666.90386: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867666.90404: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867666.90422: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867666.90489: variable '__network_is_ostree' from source: set_fact 30575 1726867666.90494: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867666.90497: when evaluation is False, skipping this task 30575 1726867666.90500: _execute() done 30575 1726867666.90503: dumping result to json 30575 1726867666.90508: done dumping result, returning 30575 1726867666.90517: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-e081-a588-000000002203] 30575 1726867666.90520: sending task result for task 0affcac9-a3a5-e081-a588-000000002203 30575 1726867666.90598: done sending task result for task 0affcac9-a3a5-e081-a588-000000002203 30575 1726867666.90600: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867666.90649: no more pending results, returning what we have 30575 1726867666.90653: results queue empty 30575 1726867666.90653: checking for any_errors_fatal 30575 1726867666.90662: done checking for any_errors_fatal 30575 1726867666.90662: checking for max_fail_percentage 30575 1726867666.90664: done checking for max_fail_percentage 30575 1726867666.90665: checking to see if all hosts have failed and the running result is not ok 30575 1726867666.90666: done checking to see if all hosts have failed 30575 1726867666.90666: getting the remaining hosts for this loop 30575 1726867666.90668: done getting the remaining hosts for this loop 
30575 1726867666.90671: getting the next task for host managed_node3 30575 1726867666.90684: done getting next task for host managed_node3 30575 1726867666.90687: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867666.90692: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867666.90708: getting variables 30575 1726867666.90719: in VariableManager get_vars() 30575 1726867666.90754: Calling all_inventory to load vars for managed_node3 30575 1726867666.90756: Calling groups_inventory to load vars for managed_node3 30575 1726867666.90759: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867666.90766: Calling all_plugins_play to load vars for managed_node3 30575 1726867666.90769: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867666.90771: Calling groups_plugins_play to load vars for managed_node3 30575 1726867666.91801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867666.93157: done with get_vars() 30575 1726867666.93172: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:27:46 -0400 (0:00:00.044) 0:01:42.309 ****** 30575 1726867666.93239: entering _queue_task() for managed_node3/service_facts 30575 1726867666.93440: worker is 1 (out of 1 available) 30575 1726867666.93454: exiting _queue_task() for managed_node3/service_facts 30575 1726867666.93468: done queuing things up, now waiting for results queue to drain 30575 1726867666.93470: waiting for pending results... 
30575 1726867666.93659: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867666.93760: in run() - task 0affcac9-a3a5-e081-a588-000000002205 30575 1726867666.93774: variable 'ansible_search_path' from source: unknown 30575 1726867666.93778: variable 'ansible_search_path' from source: unknown 30575 1726867666.93811: calling self._execute() 30575 1726867666.93882: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.93887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.93896: variable 'omit' from source: magic vars 30575 1726867666.94191: variable 'ansible_distribution_major_version' from source: facts 30575 1726867666.94226: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867666.94229: variable 'omit' from source: magic vars 30575 1726867666.94310: variable 'omit' from source: magic vars 30575 1726867666.94314: variable 'omit' from source: magic vars 30575 1726867666.94352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867666.94394: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867666.94582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867666.94587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867666.94590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867666.94592: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867666.94595: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.94597: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867666.94604: Set connection var ansible_pipelining to False 30575 1726867666.94611: Set connection var ansible_shell_type to sh 30575 1726867666.94613: Set connection var ansible_shell_executable to /bin/sh 30575 1726867666.94627: Set connection var ansible_timeout to 10 30575 1726867666.94638: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867666.94648: Set connection var ansible_connection to ssh 30575 1726867666.94673: variable 'ansible_shell_executable' from source: unknown 30575 1726867666.94683: variable 'ansible_connection' from source: unknown 30575 1726867666.94690: variable 'ansible_module_compression' from source: unknown 30575 1726867666.94697: variable 'ansible_shell_type' from source: unknown 30575 1726867666.94702: variable 'ansible_shell_executable' from source: unknown 30575 1726867666.94708: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867666.94714: variable 'ansible_pipelining' from source: unknown 30575 1726867666.94725: variable 'ansible_timeout' from source: unknown 30575 1726867666.94731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867666.94940: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867666.94962: variable 'omit' from source: magic vars 30575 1726867666.94970: starting attempt loop 30575 1726867666.94973: running the handler 30575 1726867666.94992: _low_level_execute_command(): starting 30575 1726867666.94999: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867666.95731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867666.95751: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867666.95793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867666.95839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867666.97551: stdout chunk (state=3): >>>/root <<< 30575 1726867666.97654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867666.97678: stderr chunk (state=3): >>><<< 30575 1726867666.97682: stdout chunk (state=3): >>><<< 30575 1726867666.97698: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867666.97708: _low_level_execute_command(): starting 30575 1726867666.97714: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867666.976976-35379-5588034484948 `" && echo ansible-tmp-1726867666.976976-35379-5588034484948="` echo /root/.ansible/tmp/ansible-tmp-1726867666.976976-35379-5588034484948 `" ) && sleep 0' 30575 1726867666.98295: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867666.98356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867666.98372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867666.98396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867666.98513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867667.00364: stdout chunk (state=3): >>>ansible-tmp-1726867666.976976-35379-5588034484948=/root/.ansible/tmp/ansible-tmp-1726867666.976976-35379-5588034484948 <<< 30575 1726867667.00476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867667.00498: stderr chunk (state=3): >>><<< 30575 1726867667.00501: stdout chunk (state=3): >>><<< 30575 1726867667.00512: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867666.976976-35379-5588034484948=/root/.ansible/tmp/ansible-tmp-1726867666.976976-35379-5588034484948 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867667.00546: variable 'ansible_module_compression' from source: unknown 30575 1726867667.00580: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30575 1726867667.00610: variable 'ansible_facts' from source: unknown 30575 1726867667.00659: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867666.976976-35379-5588034484948/AnsiballZ_service_facts.py 30575 1726867667.00750: Sending initial data 30575 1726867667.00753: Sent initial data (159 bytes) 30575 1726867667.01163: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867667.01166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867667.01169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867667.01171: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867667.01173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867667.01231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867667.01234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867667.01271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867667.02811: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30575 1726867667.02821: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867667.02854: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867667.02903: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpo9xz2r4w /root/.ansible/tmp/ansible-tmp-1726867666.976976-35379-5588034484948/AnsiballZ_service_facts.py <<< 30575 1726867667.02906: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867666.976976-35379-5588034484948/AnsiballZ_service_facts.py" <<< 30575 1726867667.02948: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpo9xz2r4w" to remote "/root/.ansible/tmp/ansible-tmp-1726867666.976976-35379-5588034484948/AnsiballZ_service_facts.py" <<< 30575 1726867667.02951: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867666.976976-35379-5588034484948/AnsiballZ_service_facts.py" <<< 30575 1726867667.03532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867667.03539: stderr chunk (state=3): >>><<< 30575 1726867667.03542: stdout chunk (state=3): >>><<< 30575 1726867667.03580: done transferring module to remote 30575 1726867667.03589: _low_level_execute_command(): starting 30575 1726867667.03594: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867666.976976-35379-5588034484948/ /root/.ansible/tmp/ansible-tmp-1726867666.976976-35379-5588034484948/AnsiballZ_service_facts.py && sleep 0' 30575 1726867667.04002: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867667.04006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 30575 1726867667.04010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867667.04012: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867667.04014: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867667.04062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867667.04072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867667.04112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867667.05855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867667.05873: stderr chunk (state=3): >>><<< 30575 1726867667.05878: stdout chunk (state=3): >>><<< 30575 1726867667.05889: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867667.05893: _low_level_execute_command(): starting 30575 1726867667.05896: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867666.976976-35379-5588034484948/AnsiballZ_service_facts.py && sleep 0' 30575 1726867667.06282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867667.06285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867667.06288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867667.06302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867667.06337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867667.06349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867667.06405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867668.56337: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30575 1726867668.56396: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": 
{"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30575 1726867668.57907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867668.58083: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. <<< 30575 1726867668.58087: stdout chunk (state=3): >>><<< 30575 1726867668.58090: stderr chunk (state=3): >>><<< 30575 1726867668.58094: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": 
"systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", 
"source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": 
"nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": 
{"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
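The long JSON payload above is the return value of the `service_facts` module: a flat mapping from unit name to a dict with `state`, `status`, and `source`. A minimal sketch of filtering that mapping for running units, using a trimmed-down illustrative sample (the service names are copied from the log, but the sample is hand-built, not the full payload):

```python
import json

# Hand-trimmed sample mirroring the shape of the service_facts payload above;
# the real result nests the full services dict under ansible_facts["services"].
payload = json.loads("""
{"ansible_facts": {"services": {
  "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"},
  "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"},
  "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}
}}, "invocation": {"module_args": {}}}
""")

services = payload["ansible_facts"]["services"]
running = sorted(name for name, svc in services.items() if svc["state"] == "running")
print(running)  # → ['systemd-udevd.service', 'user@0.service']
```

In a playbook the same filter is typically done in Jinja2 against `ansible_facts.services` after the role's "Check which services are running" task has populated it.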
30575 1726867668.58814: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867666.976976-35379-5588034484948/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867668.58833: _low_level_execute_command(): starting 30575 1726867668.58845: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867666.976976-35379-5588034484948/ > /dev/null 2>&1 && sleep 0' 30575 1726867668.59532: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867668.59552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867668.59643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867668.59663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867668.59701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867668.59722: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867668.59751: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867668.59833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867668.61728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867668.61753: stdout chunk (state=3): >>><<< 30575 1726867668.61755: stderr chunk (state=3): >>><<< 30575 1726867668.61768: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 30575 1726867668.61778: handler run complete 30575 1726867668.62089: variable 'ansible_facts' from source: unknown 30575 1726867668.62145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867668.62680: variable 'ansible_facts' from source: unknown 30575 1726867668.62824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867668.63044: attempt loop complete, returning result 30575 1726867668.63055: _execute() done 30575 1726867668.63065: dumping result to json 30575 1726867668.63136: done dumping result, returning 30575 1726867668.63151: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-e081-a588-000000002205] 30575 1726867668.63162: sending task result for task 0affcac9-a3a5-e081-a588-000000002205 30575 1726867668.64389: done sending task result for task 0affcac9-a3a5-e081-a588-000000002205 30575 1726867668.64392: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867668.64486: no more pending results, returning what we have 30575 1726867668.64489: results queue empty 30575 1726867668.64490: checking for any_errors_fatal 30575 1726867668.64494: done checking for any_errors_fatal 30575 1726867668.64495: checking for max_fail_percentage 30575 1726867668.64496: done checking for max_fail_percentage 30575 1726867668.64497: checking to see if all hosts have failed and the running result is not ok 30575 1726867668.64498: done checking to see if all hosts have failed 30575 1726867668.64499: getting the remaining hosts for this loop 30575 1726867668.64500: done getting the remaining hosts for this loop 30575 1726867668.64619: getting the next task for host managed_node3 30575 1726867668.64627: done getting next task 
for host managed_node3 30575 1726867668.64631: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867668.64636: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867668.64649: getting variables 30575 1726867668.64651: in VariableManager get_vars() 30575 1726867668.64737: Calling all_inventory to load vars for managed_node3 30575 1726867668.64740: Calling groups_inventory to load vars for managed_node3 30575 1726867668.64742: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867668.64750: Calling all_plugins_play to load vars for managed_node3 30575 1726867668.64753: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867668.64756: Calling groups_plugins_play to load vars for managed_node3 30575 1726867668.66223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867668.67999: done with get_vars() 30575 1726867668.68021: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:27:48 -0400 (0:00:01.748) 0:01:44.058 ****** 30575 1726867668.68130: entering _queue_task() for managed_node3/package_facts 30575 1726867668.68594: worker is 1 (out of 1 available) 30575 1726867668.68607: exiting _queue_task() for managed_node3/package_facts 30575 1726867668.68622: done queuing things up, now waiting for results queue to drain 30575 1726867668.68623: waiting for pending results... 
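The task banner printed above ("... 17:27:48 -0400 (0:00:01.748) 0:01:44.058 ******") carries two timings: the previous task's duration in parentheses and the cumulative playbook runtime. A small sketch of extracting both from a captured log line; the regex is an assumption based on the banner format seen here, not an Ansible API:

```python
import re

# Banner line copied from the log above (whitespace between fields may vary).
banner = "Friday 20 September 2024  17:27:48 -0400 (0:00:01.748)       0:01:44.058 ******"

match = re.search(r"\((\d+):(\d+):([\d.]+)\)\s+(\d+):(\d+):([\d.]+)", banner)

def to_seconds(hours, minutes, seconds):
    """Convert H:M:S.fff string fields to a float number of seconds."""
    return int(hours) * 3600 + int(minutes) * 60 + float(seconds)

task_secs = to_seconds(*match.group(1, 2, 3))    # previous task duration
total_secs = to_seconds(*match.group(4, 5, 6))   # cumulative runtime
print(round(task_secs, 3), round(total_secs, 3))  # → 1.748 104.058
```

So the preceding `service_facts` task took about 1.75 s out of roughly 104 s of total runtime; the `ansible.posix.profile_tasks` callback reports the same numbers in a summarized form.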
30575 1726867668.68999: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867668.69079: in run() - task 0affcac9-a3a5-e081-a588-000000002206 30575 1726867668.69084: variable 'ansible_search_path' from source: unknown 30575 1726867668.69088: variable 'ansible_search_path' from source: unknown 30575 1726867668.69090: calling self._execute() 30575 1726867668.69190: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867668.69205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867668.69224: variable 'omit' from source: magic vars 30575 1726867668.69624: variable 'ansible_distribution_major_version' from source: facts 30575 1726867668.69645: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867668.69660: variable 'omit' from source: magic vars 30575 1726867668.69751: variable 'omit' from source: magic vars 30575 1726867668.69797: variable 'omit' from source: magic vars 30575 1726867668.69860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867668.69893: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867668.69917: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867668.69938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867668.69968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867668.70001: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867668.70078: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867668.70082: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867668.70130: Set connection var ansible_pipelining to False 30575 1726867668.70139: Set connection var ansible_shell_type to sh 30575 1726867668.70149: Set connection var ansible_shell_executable to /bin/sh 30575 1726867668.70183: Set connection var ansible_timeout to 10 30575 1726867668.70186: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867668.70188: Set connection var ansible_connection to ssh 30575 1726867668.70215: variable 'ansible_shell_executable' from source: unknown 30575 1726867668.70224: variable 'ansible_connection' from source: unknown 30575 1726867668.70231: variable 'ansible_module_compression' from source: unknown 30575 1726867668.70292: variable 'ansible_shell_type' from source: unknown 30575 1726867668.70296: variable 'ansible_shell_executable' from source: unknown 30575 1726867668.70298: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867668.70305: variable 'ansible_pipelining' from source: unknown 30575 1726867668.70307: variable 'ansible_timeout' from source: unknown 30575 1726867668.70310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867668.70483: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867668.70500: variable 'omit' from source: magic vars 30575 1726867668.70513: starting attempt loop 30575 1726867668.70524: running the handler 30575 1726867668.70546: _low_level_execute_command(): starting 30575 1726867668.70558: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867668.71319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867668.71402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867668.71452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867668.71474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867668.71548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867668.73194: stdout chunk (state=3): >>>/root <<< 30575 1726867668.73325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867668.73338: stdout chunk (state=3): >>><<< 30575 1726867668.73451: stderr chunk (state=3): >>><<< 30575 1726867668.73455: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867668.73458: _low_level_execute_command(): starting 30575 1726867668.73461: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867668.733718-35432-90235437103946 `" && echo ansible-tmp-1726867668.733718-35432-90235437103946="` echo /root/.ansible/tmp/ansible-tmp-1726867668.733718-35432-90235437103946 `" ) && sleep 0' 30575 1726867668.74088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867668.74113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867668.74129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867668.74211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867668.76100: stdout chunk (state=3): >>>ansible-tmp-1726867668.733718-35432-90235437103946=/root/.ansible/tmp/ansible-tmp-1726867668.733718-35432-90235437103946 <<< 30575 1726867668.76254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867668.76257: stdout chunk (state=3): >>><<< 30575 1726867668.76259: stderr chunk (state=3): >>><<< 30575 1726867668.76281: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867668.733718-35432-90235437103946=/root/.ansible/tmp/ansible-tmp-1726867668.733718-35432-90235437103946 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867668.76482: variable 'ansible_module_compression' from source: unknown 30575 1726867668.76485: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30575 1726867668.76487: variable 'ansible_facts' from source: unknown 30575 1726867668.76632: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867668.733718-35432-90235437103946/AnsiballZ_package_facts.py 30575 1726867668.76845: Sending initial data 30575 1726867668.76848: Sent initial data (160 bytes) 30575 1726867668.77370: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867668.77391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867668.77407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867668.77494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
30575 1726867668.77527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867668.77549: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867668.77561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867668.77638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867668.79205: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867668.79284: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867668.79330: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpkk123w9x /root/.ansible/tmp/ansible-tmp-1726867668.733718-35432-90235437103946/AnsiballZ_package_facts.py <<< 30575 1726867668.79353: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867668.733718-35432-90235437103946/AnsiballZ_package_facts.py" <<< 30575 1726867668.79395: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpkk123w9x" to remote "/root/.ansible/tmp/ansible-tmp-1726867668.733718-35432-90235437103946/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867668.733718-35432-90235437103946/AnsiballZ_package_facts.py" <<< 30575 1726867668.80982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867668.81026: stderr chunk (state=3): >>><<< 30575 1726867668.81037: stdout chunk (state=3): >>><<< 30575 1726867668.81151: done transferring module to remote 30575 1726867668.81155: _low_level_execute_command(): starting 30575 1726867668.81157: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867668.733718-35432-90235437103946/ /root/.ansible/tmp/ansible-tmp-1726867668.733718-35432-90235437103946/AnsiballZ_package_facts.py && sleep 0' 30575 1726867668.81739: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867668.81755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867668.81769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867668.81788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867668.81802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 <<< 30575 1726867668.81832: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867668.81920: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867668.81946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867668.82014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867668.83840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867668.83847: stdout chunk (state=3): >>><<< 30575 1726867668.83854: stderr chunk (state=3): >>><<< 30575 1726867668.83865: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867668.83870: _low_level_execute_command(): starting 30575 1726867668.83883: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867668.733718-35432-90235437103946/AnsiballZ_package_facts.py && sleep 0' 30575 1726867668.84301: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867668.84304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867668.84307: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867668.84309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867668.84311: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 
1726867668.84358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867668.84361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867668.84419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867669.28194: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": 
[{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", 
"release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30575 1726867669.28209: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": 
"4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30575 1726867669.28232: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": 
[{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 30575 1726867669.28262: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": 
"gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 30575 1726867669.28278: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 30575 1726867669.28289: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": 
"256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 30575 1726867669.28315: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 30575 1726867669.28341: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": 
"9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", 
"release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30575 1726867669.28360: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": 
"2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30575 1726867669.28366: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30575 1726867669.28397: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": 
"wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30575 1726867669.28411: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": 
[{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 30575 1726867669.28420: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30575 1726867669.30183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867669.30205: stderr chunk (state=3): >>><<< 30575 1726867669.30208: stdout chunk (state=3): >>><<< 30575 1726867669.30244: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867669.31835: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867668.733718-35432-90235437103946/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867669.31851: _low_level_execute_command(): starting 30575 1726867669.31861: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867668.733718-35432-90235437103946/ > /dev/null 2>&1 && sleep 0' 30575 1726867669.32492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867669.32582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867669.32606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867669.32670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867669.32707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867669.34609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867669.34627: stdout chunk (state=3): >>><<< 30575 1726867669.34649: stderr chunk (state=3): >>><<< 30575 1726867669.34667: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867669.34680: handler run complete 30575 1726867669.35783: variable 'ansible_facts' from source: unknown 30575 1726867669.36196: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867669.38308: variable 'ansible_facts' from source: unknown 30575 1726867669.38771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867669.39542: attempt loop complete, returning result 30575 1726867669.39561: _execute() done 30575 1726867669.39568: dumping result to json 30575 1726867669.39801: done dumping result, returning 30575 1726867669.39820: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-e081-a588-000000002206] 30575 1726867669.39832: sending task result for task 0affcac9-a3a5-e081-a588-000000002206 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867669.48754: done sending task result for task 0affcac9-a3a5-e081-a588-000000002206 30575 1726867669.48760: WORKER PROCESS EXITING 30575 1726867669.48769: no more pending results, returning what we have 30575 1726867669.48771: results queue empty 30575 1726867669.48771: checking for any_errors_fatal 30575 1726867669.48775: done checking for any_errors_fatal 30575 1726867669.48775: checking for max_fail_percentage 30575 1726867669.48776: done checking for max_fail_percentage 30575 1726867669.48778: checking to see if all hosts have failed and the running result is not ok 30575 1726867669.48779: done checking to see if all hosts have failed 30575 1726867669.48780: getting the remaining hosts for this loop 30575 1726867669.48781: done getting the remaining hosts for this loop 30575 1726867669.48783: getting the next task for host managed_node3 30575 1726867669.48788: done getting next task for host managed_node3 30575 1726867669.48791: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867669.48794: 
^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867669.48804: getting variables 30575 1726867669.48805: in VariableManager get_vars() 30575 1726867669.48832: Calling all_inventory to load vars for managed_node3 30575 1726867669.48834: Calling groups_inventory to load vars for managed_node3 30575 1726867669.48836: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867669.48842: Calling all_plugins_play to load vars for managed_node3 30575 1726867669.48843: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867669.48845: Calling groups_plugins_play to load vars for managed_node3 30575 1726867669.49530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867669.50804: done with get_vars() 30575 1726867669.50824: done getting variables 30575 1726867669.50875: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:27:49 -0400 (0:00:00.827) 0:01:44.886 ****** 30575 1726867669.50907: entering _queue_task() for managed_node3/debug 30575 1726867669.51166: worker is 1 (out of 1 available) 30575 1726867669.51181: exiting _queue_task() for managed_node3/debug 30575 1726867669.51194: done queuing things up, now waiting for results queue to drain 30575 1726867669.51195: waiting for pending results... 
30575 1726867669.51383: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867669.51480: in run() - task 0affcac9-a3a5-e081-a588-0000000021a4 30575 1726867669.51493: variable 'ansible_search_path' from source: unknown 30575 1726867669.51498: variable 'ansible_search_path' from source: unknown 30575 1726867669.51533: calling self._execute() 30575 1726867669.51602: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867669.51607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867669.51615: variable 'omit' from source: magic vars 30575 1726867669.51890: variable 'ansible_distribution_major_version' from source: facts 30575 1726867669.51899: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867669.51905: variable 'omit' from source: magic vars 30575 1726867669.51951: variable 'omit' from source: magic vars 30575 1726867669.52022: variable 'network_provider' from source: set_fact 30575 1726867669.52034: variable 'omit' from source: magic vars 30575 1726867669.52065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867669.52095: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867669.52112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867669.52126: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867669.52137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867669.52160: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867669.52163: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867669.52166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867669.52237: Set connection var ansible_pipelining to False 30575 1726867669.52240: Set connection var ansible_shell_type to sh 30575 1726867669.52244: Set connection var ansible_shell_executable to /bin/sh 30575 1726867669.52249: Set connection var ansible_timeout to 10 30575 1726867669.52254: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867669.52260: Set connection var ansible_connection to ssh 30575 1726867669.52280: variable 'ansible_shell_executable' from source: unknown 30575 1726867669.52283: variable 'ansible_connection' from source: unknown 30575 1726867669.52287: variable 'ansible_module_compression' from source: unknown 30575 1726867669.52289: variable 'ansible_shell_type' from source: unknown 30575 1726867669.52292: variable 'ansible_shell_executable' from source: unknown 30575 1726867669.52295: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867669.52297: variable 'ansible_pipelining' from source: unknown 30575 1726867669.52299: variable 'ansible_timeout' from source: unknown 30575 1726867669.52301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867669.52401: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867669.52412: variable 'omit' from source: magic vars 30575 1726867669.52415: starting attempt loop 30575 1726867669.52420: running the handler 30575 1726867669.52456: handler run complete 30575 1726867669.52467: attempt loop complete, returning result 30575 1726867669.52469: _execute() done 30575 1726867669.52472: dumping result to json 30575 1726867669.52475: done dumping result, returning 
30575 1726867669.52483: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-e081-a588-0000000021a4] 30575 1726867669.52488: sending task result for task 0affcac9-a3a5-e081-a588-0000000021a4 30575 1726867669.52565: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021a4 30575 1726867669.52568: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30575 1726867669.52634: no more pending results, returning what we have 30575 1726867669.52638: results queue empty 30575 1726867669.52638: checking for any_errors_fatal 30575 1726867669.52650: done checking for any_errors_fatal 30575 1726867669.52650: checking for max_fail_percentage 30575 1726867669.52651: done checking for max_fail_percentage 30575 1726867669.52652: checking to see if all hosts have failed and the running result is not ok 30575 1726867669.52653: done checking to see if all hosts have failed 30575 1726867669.52654: getting the remaining hosts for this loop 30575 1726867669.52656: done getting the remaining hosts for this loop 30575 1726867669.52659: getting the next task for host managed_node3 30575 1726867669.52667: done getting next task for host managed_node3 30575 1726867669.52671: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867669.52675: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867669.52689: getting variables 30575 1726867669.52691: in VariableManager get_vars() 30575 1726867669.52730: Calling all_inventory to load vars for managed_node3 30575 1726867669.52732: Calling groups_inventory to load vars for managed_node3 30575 1726867669.52734: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867669.52742: Calling all_plugins_play to load vars for managed_node3 30575 1726867669.52745: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867669.52747: Calling groups_plugins_play to load vars for managed_node3 30575 1726867669.57579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867669.58424: done with get_vars() 30575 1726867669.58441: done getting variables 30575 1726867669.58474: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:27:49 -0400 (0:00:00.075) 0:01:44.962 ****** 30575 1726867669.58501: entering _queue_task() for managed_node3/fail 30575 1726867669.58772: worker is 1 (out of 1 available) 30575 1726867669.58790: exiting _queue_task() for managed_node3/fail 30575 1726867669.58803: done queuing things up, now waiting for results queue to drain 30575 1726867669.58805: waiting for pending results... 30575 1726867669.58997: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867669.59102: in run() - task 0affcac9-a3a5-e081-a588-0000000021a5 30575 1726867669.59114: variable 'ansible_search_path' from source: unknown 30575 1726867669.59121: variable 'ansible_search_path' from source: unknown 30575 1726867669.59151: calling self._execute() 30575 1726867669.59228: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867669.59233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867669.59242: variable 'omit' from source: magic vars 30575 1726867669.59528: variable 'ansible_distribution_major_version' from source: facts 30575 1726867669.59538: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867669.59629: variable 'network_state' from source: role '' defaults 30575 1726867669.59637: Evaluated conditional (network_state != {}): False 30575 1726867669.59640: when evaluation is False, skipping this task 30575 1726867669.59644: _execute() done 30575 1726867669.59647: dumping result to json 30575 1726867669.59649: done dumping result, returning 30575 1726867669.59657: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-e081-a588-0000000021a5] 30575 1726867669.59662: sending task result for task 0affcac9-a3a5-e081-a588-0000000021a5 30575 1726867669.59751: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021a5 30575 1726867669.59754: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867669.59832: no more pending results, returning what we have 30575 1726867669.59835: results queue empty 30575 1726867669.59835: checking for any_errors_fatal 30575 1726867669.59842: done checking for any_errors_fatal 30575 1726867669.59843: checking for max_fail_percentage 30575 1726867669.59844: done checking for max_fail_percentage 30575 1726867669.59845: checking to see if all hosts have failed and the running result is not ok 30575 1726867669.59846: done checking to see if all hosts have failed 30575 1726867669.59847: getting the remaining hosts for this loop 30575 1726867669.59848: done getting the remaining hosts for this loop 30575 1726867669.59852: getting the next task for host managed_node3 30575 1726867669.59859: done getting next task for host managed_node3 30575 1726867669.59864: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867669.59869: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867669.59888: getting variables 30575 1726867669.59890: in VariableManager get_vars() 30575 1726867669.59932: Calling all_inventory to load vars for managed_node3 30575 1726867669.59934: Calling groups_inventory to load vars for managed_node3 30575 1726867669.59937: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867669.59945: Calling all_plugins_play to load vars for managed_node3 30575 1726867669.59947: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867669.59949: Calling groups_plugins_play to load vars for managed_node3 30575 1726867669.60695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867669.61664: done with get_vars() 30575 1726867669.61680: done getting variables 30575 1726867669.61722: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:27:49 -0400 (0:00:00.032) 0:01:44.995 ****** 30575 1726867669.61745: entering _queue_task() for managed_node3/fail 30575 1726867669.61956: worker is 1 (out of 1 available) 30575 1726867669.61968: exiting _queue_task() for managed_node3/fail 30575 1726867669.61983: done queuing things up, now waiting for results queue to drain 30575 1726867669.61984: waiting for pending results... 30575 1726867669.62152: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867669.62250: in run() - task 0affcac9-a3a5-e081-a588-0000000021a6 30575 1726867669.62260: variable 'ansible_search_path' from source: unknown 30575 1726867669.62264: variable 'ansible_search_path' from source: unknown 30575 1726867669.62292: calling self._execute() 30575 1726867669.62363: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867669.62367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867669.62375: variable 'omit' from source: magic vars 30575 1726867669.62637: variable 'ansible_distribution_major_version' from source: facts 30575 1726867669.62650: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867669.62735: variable 'network_state' from source: role '' defaults 30575 1726867669.62743: Evaluated conditional (network_state != {}): False 30575 1726867669.62746: when evaluation is False, skipping this task 30575 1726867669.62749: _execute() done 30575 1726867669.62754: dumping result to json 30575 1726867669.62756: done dumping result, returning 30575 1726867669.62769: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-e081-a588-0000000021a6] 30575 1726867669.62773: sending task result for task 0affcac9-a3a5-e081-a588-0000000021a6 30575 1726867669.62855: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021a6 30575 1726867669.62857: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867669.62910: no more pending results, returning what we have 30575 1726867669.62913: results queue empty 30575 1726867669.62914: checking for any_errors_fatal 30575 1726867669.62921: done checking for any_errors_fatal 30575 1726867669.62922: checking for max_fail_percentage 30575 1726867669.62924: done checking for max_fail_percentage 30575 1726867669.62924: checking to see if all hosts have failed and the running result is not ok 30575 1726867669.62925: done checking to see if all hosts have failed 30575 1726867669.62926: getting the remaining hosts for this loop 30575 1726867669.62927: done getting the remaining hosts for this loop 30575 1726867669.62930: getting the next task for host managed_node3 30575 1726867669.62937: done getting next task for host managed_node3 30575 1726867669.62940: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867669.62945: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867669.62961: getting variables 30575 1726867669.62963: in VariableManager get_vars() 30575 1726867669.62999: Calling all_inventory to load vars for managed_node3 30575 1726867669.63001: Calling groups_inventory to load vars for managed_node3 30575 1726867669.63003: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867669.63010: Calling all_plugins_play to load vars for managed_node3 30575 1726867669.63012: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867669.63015: Calling groups_plugins_play to load vars for managed_node3 30575 1726867669.63735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867669.64605: done with get_vars() 30575 1726867669.64621: done getting variables 30575 1726867669.64661: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:27:49 -0400 (0:00:00.029) 0:01:45.024 ****** 30575 1726867669.64685: entering _queue_task() for managed_node3/fail 30575 1726867669.64875: worker is 1 (out of 1 available) 30575 1726867669.64890: exiting _queue_task() for managed_node3/fail 30575 1726867669.64904: done queuing things up, now waiting for results queue to drain 30575 1726867669.64906: waiting for pending results... 30575 1726867669.65073: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867669.65163: in run() - task 0affcac9-a3a5-e081-a588-0000000021a7 30575 1726867669.65172: variable 'ansible_search_path' from source: unknown 30575 1726867669.65176: variable 'ansible_search_path' from source: unknown 30575 1726867669.65205: calling self._execute() 30575 1726867669.65275: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867669.65282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867669.65290: variable 'omit' from source: magic vars 30575 1726867669.65545: variable 'ansible_distribution_major_version' from source: facts 30575 1726867669.65553: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867669.65671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867669.67171: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867669.67226: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867669.67251: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867669.67279: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867669.67299: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867669.67356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867669.67376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867669.67398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867669.67428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867669.67440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867669.67507: variable 'ansible_distribution_major_version' from source: facts 30575 1726867669.67523: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30575 1726867669.67595: variable 'ansible_distribution' from source: facts 30575 1726867669.67599: variable '__network_rh_distros' from source: role '' defaults 30575 1726867669.67606: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30575 1726867669.67764: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867669.67781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867669.67798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867669.67843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867669.67873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867669.68082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867669.68086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867669.68089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867669.68091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 
1726867669.68093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867669.68095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867669.68115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867669.68149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867669.68196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867669.68228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867669.68559: variable 'network_connections' from source: include params 30575 1726867669.68575: variable 'interface' from source: play vars 30575 1726867669.68660: variable 'interface' from source: play vars 30575 1726867669.68676: variable 'network_state' from source: role '' defaults 30575 1726867669.68768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867669.69160: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867669.69189: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867669.69214: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867669.69238: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867669.69269: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867669.69285: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867669.69313: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867669.69331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867669.69357: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30575 1726867669.69360: when evaluation is False, skipping this task 30575 1726867669.69363: _execute() done 30575 1726867669.69366: dumping result to json 30575 1726867669.69368: done dumping result, returning 30575 1726867669.69375: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-e081-a588-0000000021a7] 30575 1726867669.69382: sending task result for task 
0affcac9-a3a5-e081-a588-0000000021a7 30575 1726867669.69459: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021a7 30575 1726867669.69461: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30575 1726867669.69508: no more pending results, returning what we have 30575 1726867669.69511: results queue empty 30575 1726867669.69512: checking for any_errors_fatal 30575 1726867669.69518: done checking for any_errors_fatal 30575 1726867669.69519: checking for max_fail_percentage 30575 1726867669.69520: done checking for max_fail_percentage 30575 1726867669.69521: checking to see if all hosts have failed and the running result is not ok 30575 1726867669.69522: done checking to see if all hosts have failed 30575 1726867669.69523: getting the remaining hosts for this loop 30575 1726867669.69524: done getting the remaining hosts for this loop 30575 1726867669.69528: getting the next task for host managed_node3 30575 1726867669.69536: done getting next task for host managed_node3 30575 1726867669.69539: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867669.69544: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867669.69564: getting variables 30575 1726867669.69566: in VariableManager get_vars() 30575 1726867669.69614: Calling all_inventory to load vars for managed_node3 30575 1726867669.69617: Calling groups_inventory to load vars for managed_node3 30575 1726867669.69619: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867669.69627: Calling all_plugins_play to load vars for managed_node3 30575 1726867669.69630: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867669.69632: Calling groups_plugins_play to load vars for managed_node3 30575 1726867669.70563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867669.72009: done with get_vars() 30575 1726867669.72031: done getting variables 30575 1726867669.72085: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:27:49 -0400 (0:00:00.074) 0:01:45.098 ****** 30575 1726867669.72116: entering _queue_task() for managed_node3/dnf 30575 1726867669.72431: worker is 1 (out of 1 available) 30575 1726867669.72444: exiting _queue_task() for managed_node3/dnf 30575 1726867669.72457: done queuing things up, now waiting for results queue to drain 30575 1726867669.72459: waiting for pending results... 30575 1726867669.72896: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867669.72933: in run() - task 0affcac9-a3a5-e081-a588-0000000021a8 30575 1726867669.72953: variable 'ansible_search_path' from source: unknown 30575 1726867669.72962: variable 'ansible_search_path' from source: unknown 30575 1726867669.73008: calling self._execute() 30575 1726867669.73113: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867669.73125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867669.73140: variable 'omit' from source: magic vars 30575 1726867669.73640: variable 'ansible_distribution_major_version' from source: facts 30575 1726867669.73643: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867669.73748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867669.75394: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867669.75448: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867669.75476: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867669.75504: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867669.75526: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867669.75583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867669.75606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867669.75626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867669.75652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867669.75663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867669.75767: variable 'ansible_distribution' from source: facts 30575 1726867669.75771: variable 'ansible_distribution_major_version' from source: facts 30575 1726867669.75883: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30575 1726867669.75902: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867669.76030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867669.76061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867669.76096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867669.76139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867669.76158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867669.76202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867669.76230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867669.76257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867669.76303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867669.76323: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867669.76363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867669.76393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867669.76425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867669.76469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867669.76492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867669.76650: variable 'network_connections' from source: include params 30575 1726867669.76706: variable 'interface' from source: play vars 30575 1726867669.76764: variable 'interface' from source: play vars 30575 1726867669.76793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867669.76915: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867669.76944: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867669.76970: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867669.76995: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867669.77025: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867669.77041: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867669.77069: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867669.77085: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867669.77127: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867669.77271: variable 'network_connections' from source: include params 30575 1726867669.77275: variable 'interface' from source: play vars 30575 1726867669.77394: variable 'interface' from source: play vars 30575 1726867669.77397: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867669.77399: when evaluation is False, skipping this task 30575 1726867669.77401: _execute() done 30575 1726867669.77402: dumping result to json 30575 1726867669.77404: done dumping result, returning 30575 1726867669.77406: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-0000000021a8] 30575 
1726867669.77407: sending task result for task 0affcac9-a3a5-e081-a588-0000000021a8 30575 1726867669.77467: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021a8 30575 1726867669.77470: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867669.77523: no more pending results, returning what we have 30575 1726867669.77526: results queue empty 30575 1726867669.77527: checking for any_errors_fatal 30575 1726867669.77534: done checking for any_errors_fatal 30575 1726867669.77534: checking for max_fail_percentage 30575 1726867669.77536: done checking for max_fail_percentage 30575 1726867669.77537: checking to see if all hosts have failed and the running result is not ok 30575 1726867669.77538: done checking to see if all hosts have failed 30575 1726867669.77538: getting the remaining hosts for this loop 30575 1726867669.77540: done getting the remaining hosts for this loop 30575 1726867669.77543: getting the next task for host managed_node3 30575 1726867669.77552: done getting next task for host managed_node3 30575 1726867669.77555: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867669.77560: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867669.77581: getting variables 30575 1726867669.77583: in VariableManager get_vars() 30575 1726867669.77624: Calling all_inventory to load vars for managed_node3 30575 1726867669.77627: Calling groups_inventory to load vars for managed_node3 30575 1726867669.77629: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867669.77637: Calling all_plugins_play to load vars for managed_node3 30575 1726867669.77640: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867669.77642: Calling groups_plugins_play to load vars for managed_node3 30575 1726867669.78435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867669.79867: done with get_vars() 30575 1726867669.79895: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867669.79979: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:27:49 -0400 (0:00:00.078) 0:01:45.177 ****** 30575 1726867669.80016: entering _queue_task() for managed_node3/yum 30575 1726867669.80365: worker is 1 (out of 1 available) 30575 1726867669.80380: exiting _queue_task() for managed_node3/yum 30575 1726867669.80393: done queuing things up, now waiting for results queue to drain 30575 1726867669.80395: waiting for pending results... 30575 1726867669.80912: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867669.80920: in run() - task 0affcac9-a3a5-e081-a588-0000000021a9 30575 1726867669.80924: variable 'ansible_search_path' from source: unknown 30575 1726867669.80927: variable 'ansible_search_path' from source: unknown 30575 1726867669.80930: calling self._execute() 30575 1726867669.80962: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867669.80974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867669.80992: variable 'omit' from source: magic vars 30575 1726867669.81364: variable 'ansible_distribution_major_version' from source: facts 30575 1726867669.81383: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867669.81558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867669.84253: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867669.84333: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867669.84393: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867669.84433: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867669.84465: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867669.84591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867669.84595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867669.84617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867669.84661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867669.84684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867669.84790: variable 'ansible_distribution_major_version' from source: facts 30575 1726867669.84819: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30575 1726867669.84914: when evaluation is False, skipping this task 30575 1726867669.84917: _execute() done 30575 1726867669.84919: dumping result to json 30575 1726867669.84922: done dumping result, returning 30575 1726867669.84924: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-0000000021a9] 30575 1726867669.84926: sending task result for task 0affcac9-a3a5-e081-a588-0000000021a9 30575 1726867669.85000: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021a9 30575 1726867669.85003: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30575 1726867669.85067: no more pending results, returning what we have 30575 1726867669.85071: results queue empty 30575 1726867669.85072: checking for any_errors_fatal 30575 1726867669.85080: done checking for any_errors_fatal 30575 1726867669.85081: checking for max_fail_percentage 30575 1726867669.85083: done checking for max_fail_percentage 30575 1726867669.85084: checking to see if all hosts have failed and the running result is not ok 30575 1726867669.85085: done checking to see if all hosts have failed 30575 1726867669.85086: getting the remaining hosts for this loop 30575 1726867669.85088: done getting the remaining hosts for this loop 30575 1726867669.85092: getting the next task for host managed_node3 30575 1726867669.85102: done getting next task for host managed_node3 30575 1726867669.85106: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867669.85112: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867669.85135: getting variables 30575 1726867669.85137: in VariableManager get_vars() 30575 1726867669.85290: Calling all_inventory to load vars for managed_node3 30575 1726867669.85293: Calling groups_inventory to load vars for managed_node3 30575 1726867669.85295: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867669.85305: Calling all_plugins_play to load vars for managed_node3 30575 1726867669.85309: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867669.85313: Calling groups_plugins_play to load vars for managed_node3 30575 1726867669.87093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867669.88618: done with get_vars() 30575 1726867669.88643: done getting variables 30575 1726867669.88708: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:27:49 -0400 (0:00:00.087) 0:01:45.264 ****** 30575 1726867669.88745: entering _queue_task() for managed_node3/fail 30575 1726867669.89126: worker is 1 (out of 1 available) 30575 1726867669.89141: exiting _queue_task() for managed_node3/fail 30575 1726867669.89156: done queuing things up, now waiting for results queue to drain 30575 1726867669.89157: waiting for pending results... 30575 1726867669.89507: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867669.89634: in run() - task 0affcac9-a3a5-e081-a588-0000000021aa 30575 1726867669.89784: variable 'ansible_search_path' from source: unknown 30575 1726867669.89788: variable 'ansible_search_path' from source: unknown 30575 1726867669.89791: calling self._execute() 30575 1726867669.89809: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867669.89821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867669.89837: variable 'omit' from source: magic vars 30575 1726867669.90224: variable 'ansible_distribution_major_version' from source: facts 30575 1726867669.90247: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867669.90376: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867669.90583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867669.92796: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867669.92882: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867669.92923: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867669.92966: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867669.92996: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867669.93170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867669.93173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867669.93176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867669.93188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867669.93208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867669.93255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867669.93287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867669.93320: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867669.93362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867669.93383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867669.93427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867669.93451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867669.93480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867669.93582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867669.93585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867669.93732: variable 'network_connections' from source: include params 30575 1726867669.93750: variable 'interface' from source: play vars 30575 1726867669.93823: variable 'interface' from source: play vars 30575 1726867669.93904: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867669.94083: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867669.94135: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867669.94174: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867669.94209: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867669.94257: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867669.94379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867669.94383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867669.94386: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867669.94416: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867669.94681: variable 'network_connections' from source: include params 30575 1726867669.94692: variable 'interface' from source: play vars 30575 1726867669.94758: variable 'interface' from source: play vars 30575 1726867669.94794: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867669.94802: when evaluation is False, skipping this task 30575 
1726867669.94810: _execute() done 30575 1726867669.94818: dumping result to json 30575 1726867669.94831: done dumping result, returning 30575 1726867669.94843: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-0000000021aa] 30575 1726867669.94853: sending task result for task 0affcac9-a3a5-e081-a588-0000000021aa 30575 1726867669.95070: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021aa 30575 1726867669.95073: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867669.95127: no more pending results, returning what we have 30575 1726867669.95131: results queue empty 30575 1726867669.95132: checking for any_errors_fatal 30575 1726867669.95138: done checking for any_errors_fatal 30575 1726867669.95139: checking for max_fail_percentage 30575 1726867669.95141: done checking for max_fail_percentage 30575 1726867669.95142: checking to see if all hosts have failed and the running result is not ok 30575 1726867669.95143: done checking to see if all hosts have failed 30575 1726867669.95144: getting the remaining hosts for this loop 30575 1726867669.95145: done getting the remaining hosts for this loop 30575 1726867669.95150: getting the next task for host managed_node3 30575 1726867669.95159: done getting next task for host managed_node3 30575 1726867669.95163: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30575 1726867669.95169: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867669.95193: getting variables 30575 1726867669.95195: in VariableManager get_vars() 30575 1726867669.95245: Calling all_inventory to load vars for managed_node3 30575 1726867669.95248: Calling groups_inventory to load vars for managed_node3 30575 1726867669.95250: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867669.95262: Calling all_plugins_play to load vars for managed_node3 30575 1726867669.95265: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867669.95268: Calling groups_plugins_play to load vars for managed_node3 30575 1726867669.96850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867669.97741: done with get_vars() 30575 1726867669.97758: done getting variables 30575 1726867669.97802: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:27:49 -0400 (0:00:00.090) 0:01:45.355 ****** 30575 1726867669.97830: entering _queue_task() for managed_node3/package 30575 1726867669.98074: worker is 1 (out of 1 available) 30575 1726867669.98088: exiting _queue_task() for managed_node3/package 30575 1726867669.98103: done queuing things up, now waiting for results queue to drain 30575 1726867669.98105: waiting for pending results... 30575 1726867669.98304: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30575 1726867669.98428: in run() - task 0affcac9-a3a5-e081-a588-0000000021ab 30575 1726867669.98432: variable 'ansible_search_path' from source: unknown 30575 1726867669.98435: variable 'ansible_search_path' from source: unknown 30575 1726867669.98491: calling self._execute() 30575 1726867669.98570: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867669.98579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867669.98683: variable 'omit' from source: magic vars 30575 1726867669.98998: variable 'ansible_distribution_major_version' from source: facts 30575 1726867669.99030: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867669.99218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867669.99582: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867669.99586: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867669.99588: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867669.99960: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867670.00080: variable 'network_packages' from source: role '' defaults 30575 1726867670.00189: variable '__network_provider_setup' from source: role '' defaults 30575 1726867670.00209: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867670.00272: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867670.00281: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867670.00324: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867670.00443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867670.01751: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867670.01794: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867670.01822: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867670.01846: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867670.01865: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867670.01926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867670.01955: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867670.01972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.02156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867670.02159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867670.02162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867670.02164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867670.02167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.02169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867670.02174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 
1726867670.02403: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867670.02516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867670.02544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867670.02572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.02625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867670.02643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867670.02743: variable 'ansible_python' from source: facts 30575 1726867670.02765: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867670.02856: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867670.02945: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867670.03028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867670.03049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867670.03065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.03092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867670.03102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867670.03136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867670.03163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867670.03181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.03205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867670.03216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867670.03315: variable 'network_connections' from source: include params 
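Editor's note: the `skipping: [managed_node3]` results recorded above (with `false_condition` fields such as `__network_wireless_connections_defined or __network_team_connections_defined`) are produced by per-task `when:` guards in the role. A minimal sketch of that pattern follows; the task name mirrors the log, but the body is a hypothetical illustration, not the actual `fedora.linux_system_roles.network` source.

```yaml
# Hypothetical sketch of a when:-guarded task, as seen being skipped in the
# log above. Variable names are taken from the log's false_condition output;
# the message text is invented for illustration.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-
      Wireless or team interfaces require restarting NetworkManager;
      user consent is needed before proceeding.
  when:
    # When this evaluates to False, Ansible logs "when evaluation is False,
    # skipping this task" and reports the expression as false_condition.
    - __network_wireless_connections_defined or __network_team_connections_defined
```

When every `when:` entry evaluates to false, the task result is the skip JSON shown in the log (`"skip_reason": "Conditional result was False"`), and the worker moves on to the next task in the host state.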
30575 1726867670.03322: variable 'interface' from source: play vars 30575 1726867670.03393: variable 'interface' from source: play vars 30575 1726867670.03445: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867670.03464: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867670.03488: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.03509: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867670.03548: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867670.03726: variable 'network_connections' from source: include params 30575 1726867670.03729: variable 'interface' from source: play vars 30575 1726867670.03797: variable 'interface' from source: play vars 30575 1726867670.03838: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867670.03891: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867670.04087: variable 'network_connections' from source: include params 30575 1726867670.04091: variable 'interface' from source: play vars 30575 1726867670.04139: variable 'interface' from source: play vars 30575 1726867670.04156: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867670.04209: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867670.04407: variable 'network_connections' 
from source: include params 30575 1726867670.04410: variable 'interface' from source: play vars 30575 1726867670.04458: variable 'interface' from source: play vars 30575 1726867670.04503: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867670.04546: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867670.04551: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867670.04598: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867670.04735: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867670.05031: variable 'network_connections' from source: include params 30575 1726867670.05035: variable 'interface' from source: play vars 30575 1726867670.05076: variable 'interface' from source: play vars 30575 1726867670.05085: variable 'ansible_distribution' from source: facts 30575 1726867670.05087: variable '__network_rh_distros' from source: role '' defaults 30575 1726867670.05093: variable 'ansible_distribution_major_version' from source: facts 30575 1726867670.05117: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867670.05223: variable 'ansible_distribution' from source: facts 30575 1726867670.05228: variable '__network_rh_distros' from source: role '' defaults 30575 1726867670.05231: variable 'ansible_distribution_major_version' from source: facts 30575 1726867670.05240: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867670.05348: variable 'ansible_distribution' from source: facts 30575 1726867670.05352: variable '__network_rh_distros' from source: role '' defaults 30575 1726867670.05354: variable 'ansible_distribution_major_version' from source: facts 30575 1726867670.05379: variable 'network_provider' from source: set_fact 30575 
1726867670.05390: variable 'ansible_facts' from source: unknown 30575 1726867670.05754: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30575 1726867670.05758: when evaluation is False, skipping this task 30575 1726867670.05760: _execute() done 30575 1726867670.05763: dumping result to json 30575 1726867670.05765: done dumping result, returning 30575 1726867670.05773: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-e081-a588-0000000021ab] 30575 1726867670.05784: sending task result for task 0affcac9-a3a5-e081-a588-0000000021ab 30575 1726867670.05867: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021ab 30575 1726867670.05869: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30575 1726867670.05935: no more pending results, returning what we have 30575 1726867670.05938: results queue empty 30575 1726867670.05939: checking for any_errors_fatal 30575 1726867670.05946: done checking for any_errors_fatal 30575 1726867670.05947: checking for max_fail_percentage 30575 1726867670.05949: done checking for max_fail_percentage 30575 1726867670.05949: checking to see if all hosts have failed and the running result is not ok 30575 1726867670.05951: done checking to see if all hosts have failed 30575 1726867670.05951: getting the remaining hosts for this loop 30575 1726867670.05953: done getting the remaining hosts for this loop 30575 1726867670.05957: getting the next task for host managed_node3 30575 1726867670.05965: done getting next task for host managed_node3 30575 1726867670.05969: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867670.05974: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867670.05998: getting variables 30575 1726867670.06000: in VariableManager get_vars() 30575 1726867670.06048: Calling all_inventory to load vars for managed_node3 30575 1726867670.06051: Calling groups_inventory to load vars for managed_node3 30575 1726867670.06053: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867670.06062: Calling all_plugins_play to load vars for managed_node3 30575 1726867670.06064: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867670.06067: Calling groups_plugins_play to load vars for managed_node3 30575 1726867670.06995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867670.07865: done with get_vars() 30575 1726867670.07882: done getting variables 30575 1726867670.07928: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:27:50 -0400 (0:00:00.101) 0:01:45.457 ****** 30575 1726867670.07952: entering _queue_task() for managed_node3/package 30575 1726867670.08193: worker is 1 (out of 1 available) 30575 1726867670.08209: exiting _queue_task() for managed_node3/package 30575 1726867670.08220: done queuing things up, now waiting for results queue to drain 30575 1726867670.08222: waiting for pending results... 
30575 1726867670.08425: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867670.08521: in run() - task 0affcac9-a3a5-e081-a588-0000000021ac 30575 1726867670.08532: variable 'ansible_search_path' from source: unknown 30575 1726867670.08537: variable 'ansible_search_path' from source: unknown 30575 1726867670.08565: calling self._execute() 30575 1726867670.08645: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867670.08649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867670.08657: variable 'omit' from source: magic vars 30575 1726867670.08939: variable 'ansible_distribution_major_version' from source: facts 30575 1726867670.08947: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867670.09038: variable 'network_state' from source: role '' defaults 30575 1726867670.09048: Evaluated conditional (network_state != {}): False 30575 1726867670.09051: when evaluation is False, skipping this task 30575 1726867670.09053: _execute() done 30575 1726867670.09058: dumping result to json 30575 1726867670.09060: done dumping result, returning 30575 1726867670.09069: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-e081-a588-0000000021ac] 30575 1726867670.09072: sending task result for task 0affcac9-a3a5-e081-a588-0000000021ac 30575 1726867670.09162: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021ac 30575 1726867670.09165: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867670.09246: no more pending results, returning what we have 30575 1726867670.09250: results queue empty 30575 1726867670.09251: checking 
for any_errors_fatal 30575 1726867670.09255: done checking for any_errors_fatal 30575 1726867670.09256: checking for max_fail_percentage 30575 1726867670.09258: done checking for max_fail_percentage 30575 1726867670.09258: checking to see if all hosts have failed and the running result is not ok 30575 1726867670.09259: done checking to see if all hosts have failed 30575 1726867670.09260: getting the remaining hosts for this loop 30575 1726867670.09262: done getting the remaining hosts for this loop 30575 1726867670.09265: getting the next task for host managed_node3 30575 1726867670.09272: done getting next task for host managed_node3 30575 1726867670.09278: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867670.09283: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867670.09301: getting variables 30575 1726867670.09302: in VariableManager get_vars() 30575 1726867670.09340: Calling all_inventory to load vars for managed_node3 30575 1726867670.09342: Calling groups_inventory to load vars for managed_node3 30575 1726867670.09344: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867670.09352: Calling all_plugins_play to load vars for managed_node3 30575 1726867670.09354: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867670.09358: Calling groups_plugins_play to load vars for managed_node3 30575 1726867670.10093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867670.11066: done with get_vars() 30575 1726867670.11083: done getting variables 30575 1726867670.11127: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:27:50 -0400 (0:00:00.031) 0:01:45.489 ****** 30575 1726867670.11152: entering _queue_task() for managed_node3/package 30575 1726867670.11374: worker is 1 (out of 1 available) 30575 1726867670.11390: exiting _queue_task() for managed_node3/package 30575 1726867670.11404: done queuing things up, now waiting for results queue to drain 30575 1726867670.11405: waiting for pending results... 
30575 1726867670.11592: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867670.11675: in run() - task 0affcac9-a3a5-e081-a588-0000000021ad 30575 1726867670.11687: variable 'ansible_search_path' from source: unknown 30575 1726867670.11691: variable 'ansible_search_path' from source: unknown 30575 1726867670.11722: calling self._execute() 30575 1726867670.11798: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867670.11802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867670.11811: variable 'omit' from source: magic vars 30575 1726867670.12084: variable 'ansible_distribution_major_version' from source: facts 30575 1726867670.12093: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867670.12171: variable 'network_state' from source: role '' defaults 30575 1726867670.12183: Evaluated conditional (network_state != {}): False 30575 1726867670.12191: when evaluation is False, skipping this task 30575 1726867670.12193: _execute() done 30575 1726867670.12196: dumping result to json 30575 1726867670.12199: done dumping result, returning 30575 1726867670.12208: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-e081-a588-0000000021ad] 30575 1726867670.12212: sending task result for task 0affcac9-a3a5-e081-a588-0000000021ad 30575 1726867670.12302: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021ad 30575 1726867670.12305: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867670.12349: no more pending results, returning what we have 30575 1726867670.12353: results queue empty 30575 1726867670.12353: checking for 
any_errors_fatal 30575 1726867670.12358: done checking for any_errors_fatal 30575 1726867670.12359: checking for max_fail_percentage 30575 1726867670.12360: done checking for max_fail_percentage 30575 1726867670.12361: checking to see if all hosts have failed and the running result is not ok 30575 1726867670.12362: done checking to see if all hosts have failed 30575 1726867670.12363: getting the remaining hosts for this loop 30575 1726867670.12364: done getting the remaining hosts for this loop 30575 1726867670.12367: getting the next task for host managed_node3 30575 1726867670.12375: done getting next task for host managed_node3 30575 1726867670.12384: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867670.12389: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867670.12407: getting variables 30575 1726867670.12408: in VariableManager get_vars() 30575 1726867670.12444: Calling all_inventory to load vars for managed_node3 30575 1726867670.12446: Calling groups_inventory to load vars for managed_node3 30575 1726867670.12448: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867670.12456: Calling all_plugins_play to load vars for managed_node3 30575 1726867670.12459: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867670.12462: Calling groups_plugins_play to load vars for managed_node3 30575 1726867670.13197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867670.14071: done with get_vars() 30575 1726867670.14087: done getting variables 30575 1726867670.14130: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:27:50 -0400 (0:00:00.030) 0:01:45.519 ****** 30575 1726867670.14155: entering _queue_task() for managed_node3/service 30575 1726867670.14365: worker is 1 (out of 1 available) 30575 1726867670.14381: exiting _queue_task() for managed_node3/service 30575 1726867670.14394: done queuing things up, now waiting for results queue to drain 30575 1726867670.14396: waiting for pending results... 
30575 1726867670.14590: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867670.14675: in run() - task 0affcac9-a3a5-e081-a588-0000000021ae 30575 1726867670.14688: variable 'ansible_search_path' from source: unknown 30575 1726867670.14692: variable 'ansible_search_path' from source: unknown 30575 1726867670.14725: calling self._execute() 30575 1726867670.14797: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867670.14800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867670.14809: variable 'omit' from source: magic vars 30575 1726867670.15095: variable 'ansible_distribution_major_version' from source: facts 30575 1726867670.15104: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867670.15189: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867670.15320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867670.16810: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867670.17123: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867670.17150: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867670.17175: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867670.17196: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867670.17257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30575 1726867670.17279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867670.17296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.17327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867670.17338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867670.17370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867670.17389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867670.17406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.17437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867670.17448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867670.17475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867670.17492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867670.17509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.17541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867670.17547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867670.17653: variable 'network_connections' from source: include params 30575 1726867670.17664: variable 'interface' from source: play vars 30575 1726867670.17709: variable 'interface' from source: play vars 30575 1726867670.17760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867670.17864: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867670.17898: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867670.17922: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867670.17942: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867670.17988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867670.18004: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867670.18022: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.18040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867670.18089: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867670.18233: variable 'network_connections' from source: include params 30575 1726867670.18236: variable 'interface' from source: play vars 30575 1726867670.18279: variable 'interface' from source: play vars 30575 1726867670.18306: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867670.18309: when evaluation is False, skipping this task 30575 1726867670.18312: _execute() done 30575 1726867670.18314: dumping result to json 30575 1726867670.18320: done dumping result, returning 30575 1726867670.18323: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-0000000021ae] 30575 1726867670.18327: sending task result for task 0affcac9-a3a5-e081-a588-0000000021ae 30575 1726867670.18405: done sending task result for task 
0affcac9-a3a5-e081-a588-0000000021ae 30575 1726867670.18415: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867670.18460: no more pending results, returning what we have 30575 1726867670.18464: results queue empty 30575 1726867670.18465: checking for any_errors_fatal 30575 1726867670.18471: done checking for any_errors_fatal 30575 1726867670.18472: checking for max_fail_percentage 30575 1726867670.18473: done checking for max_fail_percentage 30575 1726867670.18474: checking to see if all hosts have failed and the running result is not ok 30575 1726867670.18475: done checking to see if all hosts have failed 30575 1726867670.18476: getting the remaining hosts for this loop 30575 1726867670.18479: done getting the remaining hosts for this loop 30575 1726867670.18483: getting the next task for host managed_node3 30575 1726867670.18491: done getting next task for host managed_node3 30575 1726867670.18494: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867670.18498: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867670.18520: getting variables 30575 1726867670.18521: in VariableManager get_vars() 30575 1726867670.18561: Calling all_inventory to load vars for managed_node3 30575 1726867670.18563: Calling groups_inventory to load vars for managed_node3 30575 1726867670.18565: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867670.18574: Calling all_plugins_play to load vars for managed_node3 30575 1726867670.18576: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867670.18586: Calling groups_plugins_play to load vars for managed_node3 30575 1726867670.19523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867670.20374: done with get_vars() 30575 1726867670.20391: done getting variables 30575 1726867670.20435: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:27:50 -0400 (0:00:00.063) 0:01:45.582 ****** 30575 1726867670.20459: entering _queue_task() for managed_node3/service 30575 1726867670.20704: worker is 1 (out of 1 available) 30575 1726867670.20718: exiting _queue_task() for managed_node3/service 30575 1726867670.20731: done 
queuing things up, now waiting for results queue to drain 30575 1726867670.20732: waiting for pending results... 30575 1726867670.20930: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867670.21033: in run() - task 0affcac9-a3a5-e081-a588-0000000021af 30575 1726867670.21044: variable 'ansible_search_path' from source: unknown 30575 1726867670.21048: variable 'ansible_search_path' from source: unknown 30575 1726867670.21081: calling self._execute() 30575 1726867670.21157: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867670.21161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867670.21169: variable 'omit' from source: magic vars 30575 1726867670.21457: variable 'ansible_distribution_major_version' from source: facts 30575 1726867670.21466: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867670.21582: variable 'network_provider' from source: set_fact 30575 1726867670.21586: variable 'network_state' from source: role '' defaults 30575 1726867670.21596: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30575 1726867670.21602: variable 'omit' from source: magic vars 30575 1726867670.21644: variable 'omit' from source: magic vars 30575 1726867670.21663: variable 'network_service_name' from source: role '' defaults 30575 1726867670.21710: variable 'network_service_name' from source: role '' defaults 30575 1726867670.21786: variable '__network_provider_setup' from source: role '' defaults 30575 1726867670.21790: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867670.21837: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867670.21845: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867670.21890: variable '__network_packages_default_nm' from source: role '' 
defaults 30575 1726867670.22037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867670.23488: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867670.23539: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867670.23573: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867670.23596: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867670.23620: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867670.23681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867670.23701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867670.23721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.23745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867670.23755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867670.23792: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867670.23809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867670.23826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.23850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867670.23861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867670.24018: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867670.24086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867670.24104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867670.24125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.24149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867670.24160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867670.24225: variable 'ansible_python' from source: facts 30575 1726867670.24234: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867670.24288: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867670.24343: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867670.24422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867670.24438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867670.24464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.24490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867670.24501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867670.24537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867670.24559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867670.24576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.24603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867670.24613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867670.24707: variable 'network_connections' from source: include params 30575 1726867670.24713: variable 'interface' from source: play vars 30575 1726867670.24767: variable 'interface' from source: play vars 30575 1726867670.24841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867670.24974: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867670.25013: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867670.25045: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867670.25074: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867670.25123: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867670.25144: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867670.25165: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.25189: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867670.25231: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867670.25404: variable 'network_connections' from source: include params 30575 1726867670.25409: variable 'interface' from source: play vars 30575 1726867670.25464: variable 'interface' from source: play vars 30575 1726867670.25499: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867670.25557: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867670.25741: variable 'network_connections' from source: include params 30575 1726867670.25744: variable 'interface' from source: play vars 30575 1726867670.25796: variable 'interface' from source: play vars 30575 1726867670.25814: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867670.25870: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867670.26052: variable 'network_connections' from source: include params 30575 1726867670.26056: variable 'interface' from source: play vars 30575 1726867670.26108: variable 'interface' from source: play vars 30575 1726867670.26153: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30575 1726867670.26197: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867670.26201: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867670.26245: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867670.26380: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867670.26694: variable 'network_connections' from source: include params 30575 1726867670.26698: variable 'interface' from source: play vars 30575 1726867670.26742: variable 'interface' from source: play vars 30575 1726867670.26750: variable 'ansible_distribution' from source: facts 30575 1726867670.26753: variable '__network_rh_distros' from source: role '' defaults 30575 1726867670.26758: variable 'ansible_distribution_major_version' from source: facts 30575 1726867670.26781: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867670.26894: variable 'ansible_distribution' from source: facts 30575 1726867670.26897: variable '__network_rh_distros' from source: role '' defaults 30575 1726867670.26902: variable 'ansible_distribution_major_version' from source: facts 30575 1726867670.26910: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867670.27022: variable 'ansible_distribution' from source: facts 30575 1726867670.27026: variable '__network_rh_distros' from source: role '' defaults 30575 1726867670.27031: variable 'ansible_distribution_major_version' from source: facts 30575 1726867670.27055: variable 'network_provider' from source: set_fact 30575 1726867670.27075: variable 'omit' from source: magic vars 30575 1726867670.27098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867670.27117: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867670.27135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867670.27148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867670.27156: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867670.27183: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867670.27186: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867670.27190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867670.27257: Set connection var ansible_pipelining to False 30575 1726867670.27260: Set connection var ansible_shell_type to sh 30575 1726867670.27265: Set connection var ansible_shell_executable to /bin/sh 30575 1726867670.27271: Set connection var ansible_timeout to 10 30575 1726867670.27275: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867670.27284: Set connection var ansible_connection to ssh 30575 1726867670.27308: variable 'ansible_shell_executable' from source: unknown 30575 1726867670.27311: variable 'ansible_connection' from source: unknown 30575 1726867670.27313: variable 'ansible_module_compression' from source: unknown 30575 1726867670.27318: variable 'ansible_shell_type' from source: unknown 30575 1726867670.27320: variable 'ansible_shell_executable' from source: unknown 30575 1726867670.27322: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867670.27325: variable 'ansible_pipelining' from source: unknown 30575 1726867670.27326: variable 'ansible_timeout' from source: unknown 30575 1726867670.27329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867670.27408: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867670.27419: variable 'omit' from source: magic vars 30575 1726867670.27422: starting attempt loop 30575 1726867670.27425: running the handler 30575 1726867670.27484: variable 'ansible_facts' from source: unknown 30575 1726867670.27952: _low_level_execute_command(): starting 30575 1726867670.27958: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867670.28456: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867670.28460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867670.28464: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867670.28466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867670.28524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867670.28527: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867670.28529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867670.28588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867670.30286: stdout chunk (state=3): >>>/root <<< 30575 1726867670.30384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867670.30419: stderr chunk (state=3): >>><<< 30575 1726867670.30423: stdout chunk (state=3): >>><<< 30575 1726867670.30442: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867670.30452: _low_level_execute_command(): starting 30575 1726867670.30457: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir 
"` echo /root/.ansible/tmp/ansible-tmp-1726867670.304419-35478-121924493180306 `" && echo ansible-tmp-1726867670.304419-35478-121924493180306="` echo /root/.ansible/tmp/ansible-tmp-1726867670.304419-35478-121924493180306 `" ) && sleep 0' 30575 1726867670.30913: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867670.30919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867670.30921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867670.30923: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867670.30925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867670.30978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867670.30988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867670.31029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867670.32895: stdout chunk (state=3): >>>ansible-tmp-1726867670.304419-35478-121924493180306=/root/.ansible/tmp/ansible-tmp-1726867670.304419-35478-121924493180306 <<< 30575 
1726867670.33005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867670.33037: stderr chunk (state=3): >>><<< 30575 1726867670.33040: stdout chunk (state=3): >>><<< 30575 1726867670.33053: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867670.304419-35478-121924493180306=/root/.ansible/tmp/ansible-tmp-1726867670.304419-35478-121924493180306 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867670.33081: variable 'ansible_module_compression' from source: unknown 30575 1726867670.33123: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30575 1726867670.33178: variable 'ansible_facts' from source: unknown 30575 1726867670.33319: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726867670.304419-35478-121924493180306/AnsiballZ_systemd.py 30575 1726867670.33421: Sending initial data 30575 1726867670.33424: Sent initial data (155 bytes) 30575 1726867670.33873: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867670.33876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867670.33884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867670.33888: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867670.33890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867670.33937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867670.33941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867670.33989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867670.35527: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30575 1726867670.35531: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867670.35566: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867670.35622: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpgc_wv3wp /root/.ansible/tmp/ansible-tmp-1726867670.304419-35478-121924493180306/AnsiballZ_systemd.py <<< 30575 1726867670.35625: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867670.304419-35478-121924493180306/AnsiballZ_systemd.py" <<< 30575 1726867670.35662: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpgc_wv3wp" to remote "/root/.ansible/tmp/ansible-tmp-1726867670.304419-35478-121924493180306/AnsiballZ_systemd.py" <<< 30575 1726867670.35667: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867670.304419-35478-121924493180306/AnsiballZ_systemd.py" <<< 30575 1726867670.36739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867670.36780: stderr chunk (state=3): >>><<< 30575 1726867670.36783: stdout chunk (state=3): >>><<< 30575 1726867670.36819: done transferring module to remote 30575 1726867670.36837: _low_level_execute_command(): starting 30575 1726867670.36840: _low_level_execute_command(): executing: 
/bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867670.304419-35478-121924493180306/ /root/.ansible/tmp/ansible-tmp-1726867670.304419-35478-121924493180306/AnsiballZ_systemd.py && sleep 0' 30575 1726867670.37290: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867670.37293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867670.37295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867670.37297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867670.37299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867670.37348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867670.37352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867670.37403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867670.39151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867670.39178: stderr chunk (state=3): >>><<< 30575 1726867670.39181: stdout chunk (state=3): >>><<< 30575 1726867670.39194: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867670.39197: _low_level_execute_command(): starting 30575 1726867670.39202: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867670.304419-35478-121924493180306/AnsiballZ_systemd.py && sleep 0' 30575 1726867670.39645: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867670.39648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 30575 1726867670.39651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867670.39653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867670.39655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867670.39704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867670.39716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867670.39760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867670.68672: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": 
"success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10522624", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3326881792", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1990022000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", 
"IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30575 1726867670.68695: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", 
"LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", 
"ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system<<< 30575 1726867670.68705: stdout chunk (state=3): >>>.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 
17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30575 1726867670.70534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867670.70565: stderr chunk (state=3): >>><<< 30575 1726867670.70568: stdout chunk (state=3): >>><<< 30575 1726867670.70589: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10522624", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3326881792", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1990022000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867670.70712: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867670.304419-35478-121924493180306/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867670.70733: _low_level_execute_command(): starting 30575 1726867670.70736: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867670.304419-35478-121924493180306/ > /dev/null 2>&1 && sleep 0' 30575 1726867670.71200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867670.71203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867670.71205: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867670.71207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867670.71209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867670.71262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867670.71269: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867670.71271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867670.71309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867670.73144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867670.73170: stderr chunk (state=3): >>><<< 30575 1726867670.73173: stdout chunk (state=3): >>><<< 30575 1726867670.73187: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867670.73193: handler run complete 30575 1726867670.73233: attempt loop complete, returning result 30575 1726867670.73237: _execute() done 30575 1726867670.73239: dumping result to json 30575 1726867670.73251: done dumping result, returning 30575 1726867670.73259: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-e081-a588-0000000021af] 30575 1726867670.73266: sending task result for task 0affcac9-a3a5-e081-a588-0000000021af 30575 1726867670.73505: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021af 30575 1726867670.73507: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867670.73561: no more pending results, returning what we have 30575 1726867670.73564: results queue empty 30575 1726867670.73565: checking for any_errors_fatal 30575 1726867670.73571: done checking for any_errors_fatal 30575 1726867670.73572: checking for max_fail_percentage 30575 1726867670.73573: done checking for max_fail_percentage 30575 1726867670.73574: checking to see if all hosts have failed and the running result is not ok 30575 1726867670.73575: done checking to see if all hosts have failed 30575 1726867670.73576: getting the remaining hosts for this loop 30575 1726867670.73579: done getting the remaining hosts for this loop 30575 1726867670.73583: getting the next task for host managed_node3 30575 1726867670.73591: done getting next task for host managed_node3 30575 1726867670.73594: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867670.73599: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867670.73610: getting variables 30575 1726867670.73612: in VariableManager get_vars() 30575 1726867670.73652: Calling all_inventory to load vars for managed_node3 30575 1726867670.73654: Calling groups_inventory to load vars for managed_node3 30575 1726867670.73656: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867670.73665: Calling all_plugins_play to load vars for managed_node3 30575 1726867670.73668: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867670.73670: Calling groups_plugins_play to load vars for managed_node3 30575 1726867670.74485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867670.75482: done with get_vars() 30575 1726867670.75499: done getting variables 30575 1726867670.75546: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:27:50 -0400 (0:00:00.551) 0:01:46.133 ****** 30575 1726867670.75575: entering _queue_task() for managed_node3/service 30575 1726867670.75843: worker is 1 (out of 1 available) 30575 1726867670.75856: exiting _queue_task() for managed_node3/service 30575 1726867670.75869: done queuing things up, now waiting for results queue to drain 30575 1726867670.75871: waiting for pending results... 
30575 1726867670.76062: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867670.76157: in run() - task 0affcac9-a3a5-e081-a588-0000000021b0 30575 1726867670.76169: variable 'ansible_search_path' from source: unknown 30575 1726867670.76174: variable 'ansible_search_path' from source: unknown 30575 1726867670.76205: calling self._execute() 30575 1726867670.76282: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867670.76285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867670.76294: variable 'omit' from source: magic vars 30575 1726867670.76581: variable 'ansible_distribution_major_version' from source: facts 30575 1726867670.76590: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867670.76674: variable 'network_provider' from source: set_fact 30575 1726867670.76680: Evaluated conditional (network_provider == "nm"): True 30575 1726867670.76743: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867670.76810: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867670.76934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867670.78381: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867670.78431: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867670.78458: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867670.78486: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867670.78509: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867670.78579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867670.78601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867670.78623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.78649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867670.78660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867670.78696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867670.78714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867670.78735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.78759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867670.78770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867670.78798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867670.78814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867670.78836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.78859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867670.78871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867670.78964: variable 'network_connections' from source: include params 30575 1726867670.78974: variable 'interface' from source: play vars 30575 1726867670.79023: variable 'interface' from source: play vars 30575 1726867670.79076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867670.79190: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867670.79218: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867670.79242: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867670.79265: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867670.79299: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867670.79314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867670.79334: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.79352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867670.79392: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867670.79550: variable 'network_connections' from source: include params 30575 1726867670.79553: variable 'interface' from source: play vars 30575 1726867670.79597: variable 'interface' from source: play vars 30575 1726867670.79631: Evaluated conditional (__network_wpa_supplicant_required): False 30575 1726867670.79635: when evaluation is False, skipping this task 30575 1726867670.79637: _execute() done 30575 1726867670.79640: dumping result to json 30575 1726867670.79642: done dumping result, returning 30575 1726867670.79650: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-e081-a588-0000000021b0] 30575 
1726867670.79661: sending task result for task 0affcac9-a3a5-e081-a588-0000000021b0 30575 1726867670.79742: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021b0 30575 1726867670.79745: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30575 1726867670.79790: no more pending results, returning what we have 30575 1726867670.79793: results queue empty 30575 1726867670.79794: checking for any_errors_fatal 30575 1726867670.79820: done checking for any_errors_fatal 30575 1726867670.79821: checking for max_fail_percentage 30575 1726867670.79823: done checking for max_fail_percentage 30575 1726867670.79823: checking to see if all hosts have failed and the running result is not ok 30575 1726867670.79824: done checking to see if all hosts have failed 30575 1726867670.79825: getting the remaining hosts for this loop 30575 1726867670.79826: done getting the remaining hosts for this loop 30575 1726867670.79830: getting the next task for host managed_node3 30575 1726867670.79839: done getting next task for host managed_node3 30575 1726867670.79843: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867670.79847: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867670.79867: getting variables 30575 1726867670.79869: in VariableManager get_vars() 30575 1726867670.79922: Calling all_inventory to load vars for managed_node3 30575 1726867670.79924: Calling groups_inventory to load vars for managed_node3 30575 1726867670.79926: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867670.79935: Calling all_plugins_play to load vars for managed_node3 30575 1726867670.79937: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867670.79940: Calling groups_plugins_play to load vars for managed_node3 30575 1726867670.80739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867670.81606: done with get_vars() 30575 1726867670.81624: done getting variables 30575 1726867670.81664: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:27:50 -0400 (0:00:00.061) 0:01:46.194 
****** 30575 1726867670.81689: entering _queue_task() for managed_node3/service 30575 1726867670.81919: worker is 1 (out of 1 available) 30575 1726867670.81934: exiting _queue_task() for managed_node3/service 30575 1726867670.81948: done queuing things up, now waiting for results queue to drain 30575 1726867670.81950: waiting for pending results... 30575 1726867670.82138: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867670.82234: in run() - task 0affcac9-a3a5-e081-a588-0000000021b1 30575 1726867670.82246: variable 'ansible_search_path' from source: unknown 30575 1726867670.82251: variable 'ansible_search_path' from source: unknown 30575 1726867670.82279: calling self._execute() 30575 1726867670.82353: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867670.82356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867670.82364: variable 'omit' from source: magic vars 30575 1726867670.82635: variable 'ansible_distribution_major_version' from source: facts 30575 1726867670.82645: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867670.82727: variable 'network_provider' from source: set_fact 30575 1726867670.82731: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867670.82734: when evaluation is False, skipping this task 30575 1726867670.82737: _execute() done 30575 1726867670.82741: dumping result to json 30575 1726867670.82744: done dumping result, returning 30575 1726867670.82752: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-e081-a588-0000000021b1] 30575 1726867670.82757: sending task result for task 0affcac9-a3a5-e081-a588-0000000021b1 30575 1726867670.82840: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021b1 30575 1726867670.82843: WORKER PROCESS EXITING skipping: 
[managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867670.82889: no more pending results, returning what we have 30575 1726867670.82892: results queue empty 30575 1726867670.82893: checking for any_errors_fatal 30575 1726867670.82900: done checking for any_errors_fatal 30575 1726867670.82901: checking for max_fail_percentage 30575 1726867670.82903: done checking for max_fail_percentage 30575 1726867670.82904: checking to see if all hosts have failed and the running result is not ok 30575 1726867670.82904: done checking to see if all hosts have failed 30575 1726867670.82905: getting the remaining hosts for this loop 30575 1726867670.82906: done getting the remaining hosts for this loop 30575 1726867670.82910: getting the next task for host managed_node3 30575 1726867670.82918: done getting next task for host managed_node3 30575 1726867670.82922: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867670.82926: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867670.82942: getting variables 30575 1726867670.82944: in VariableManager get_vars() 30575 1726867670.82980: Calling all_inventory to load vars for managed_node3 30575 1726867670.82983: Calling groups_inventory to load vars for managed_node3 30575 1726867670.82985: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867670.82993: Calling all_plugins_play to load vars for managed_node3 30575 1726867670.82995: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867670.82997: Calling groups_plugins_play to load vars for managed_node3 30575 1726867670.83889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867670.84745: done with get_vars() 30575 1726867670.84760: done getting variables 30575 1726867670.84803: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:27:50 -0400 (0:00:00.031) 0:01:46.225 ****** 30575 1726867670.84829: entering _queue_task() for managed_node3/copy 30575 1726867670.85061: worker is 1 (out of 1 available) 30575 1726867670.85075: exiting _queue_task() for managed_node3/copy 30575 1726867670.85091: done queuing things up, now waiting for results queue to drain 30575 1726867670.85092: waiting for 
pending results... 30575 1726867670.85286: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867670.85365: in run() - task 0affcac9-a3a5-e081-a588-0000000021b2 30575 1726867670.85379: variable 'ansible_search_path' from source: unknown 30575 1726867670.85383: variable 'ansible_search_path' from source: unknown 30575 1726867670.85410: calling self._execute() 30575 1726867670.85487: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867670.85492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867670.85500: variable 'omit' from source: magic vars 30575 1726867670.85783: variable 'ansible_distribution_major_version' from source: facts 30575 1726867670.85792: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867670.85981: variable 'network_provider' from source: set_fact 30575 1726867670.85984: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867670.85986: when evaluation is False, skipping this task 30575 1726867670.85988: _execute() done 30575 1726867670.85989: dumping result to json 30575 1726867670.85991: done dumping result, returning 30575 1726867670.85994: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-e081-a588-0000000021b2] 30575 1726867670.85996: sending task result for task 0affcac9-a3a5-e081-a588-0000000021b2 30575 1726867670.86058: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021b2 30575 1726867670.86060: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30575 1726867670.86113: no more pending results, returning what we have 30575 1726867670.86116: results queue empty 30575 
1726867670.86117: checking for any_errors_fatal 30575 1726867670.86121: done checking for any_errors_fatal 30575 1726867670.86122: checking for max_fail_percentage 30575 1726867670.86123: done checking for max_fail_percentage 30575 1726867670.86124: checking to see if all hosts have failed and the running result is not ok 30575 1726867670.86125: done checking to see if all hosts have failed 30575 1726867670.86125: getting the remaining hosts for this loop 30575 1726867670.86126: done getting the remaining hosts for this loop 30575 1726867670.86129: getting the next task for host managed_node3 30575 1726867670.86135: done getting next task for host managed_node3 30575 1726867670.86138: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867670.86143: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867670.86155: getting variables 30575 1726867670.86156: in VariableManager get_vars() 30575 1726867670.86185: Calling all_inventory to load vars for managed_node3 30575 1726867670.86187: Calling groups_inventory to load vars for managed_node3 30575 1726867670.86188: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867670.86194: Calling all_plugins_play to load vars for managed_node3 30575 1726867670.86196: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867670.86197: Calling groups_plugins_play to load vars for managed_node3 30575 1726867670.86933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867670.87800: done with get_vars() 30575 1726867670.87816: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:27:50 -0400 (0:00:00.030) 0:01:46.256 ****** 30575 1726867670.87876: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867670.88108: worker is 1 (out of 1 available) 30575 1726867670.88122: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867670.88136: done queuing things up, now waiting for results queue to drain 30575 1726867670.88138: waiting for pending results... 
30575 1726867670.88330: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867670.88415: in run() - task 0affcac9-a3a5-e081-a588-0000000021b3 30575 1726867670.88429: variable 'ansible_search_path' from source: unknown 30575 1726867670.88432: variable 'ansible_search_path' from source: unknown 30575 1726867670.88460: calling self._execute() 30575 1726867670.88537: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867670.88541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867670.88550: variable 'omit' from source: magic vars 30575 1726867670.88831: variable 'ansible_distribution_major_version' from source: facts 30575 1726867670.88840: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867670.88847: variable 'omit' from source: magic vars 30575 1726867670.88890: variable 'omit' from source: magic vars 30575 1726867670.89004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867670.90724: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867670.90768: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867670.90797: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867670.90825: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867670.90845: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867670.90904: variable 'network_provider' from source: set_fact 30575 1726867670.91000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867670.91022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867670.91040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867670.91066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867670.91078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867670.91134: variable 'omit' from source: magic vars 30575 1726867670.91206: variable 'omit' from source: magic vars 30575 1726867670.91275: variable 'network_connections' from source: include params 30575 1726867670.91285: variable 'interface' from source: play vars 30575 1726867670.91333: variable 'interface' from source: play vars 30575 1726867670.91444: variable 'omit' from source: magic vars 30575 1726867670.91450: variable '__lsr_ansible_managed' from source: task vars 30575 1726867670.91492: variable '__lsr_ansible_managed' from source: task vars 30575 1726867670.91612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30575 1726867670.91762: Loaded config def from plugin (lookup/template) 30575 1726867670.91766: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30575 1726867670.91788: File lookup term: get_ansible_managed.j2 30575 1726867670.91791: variable 
'ansible_search_path' from source: unknown 30575 1726867670.91794: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30575 1726867670.91805: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30575 1726867670.91820: variable 'ansible_search_path' from source: unknown 30575 1726867670.95012: variable 'ansible_managed' from source: unknown 30575 1726867670.95097: variable 'omit' from source: magic vars 30575 1726867670.95114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867670.95134: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867670.95148: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867670.95161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30575 1726867670.95168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867670.95191: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867670.95194: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867670.95198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867670.95261: Set connection var ansible_pipelining to False 30575 1726867670.95264: Set connection var ansible_shell_type to sh 30575 1726867670.95269: Set connection var ansible_shell_executable to /bin/sh 30575 1726867670.95274: Set connection var ansible_timeout to 10 30575 1726867670.95281: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867670.95287: Set connection var ansible_connection to ssh 30575 1726867670.95304: variable 'ansible_shell_executable' from source: unknown 30575 1726867670.95308: variable 'ansible_connection' from source: unknown 30575 1726867670.95311: variable 'ansible_module_compression' from source: unknown 30575 1726867670.95313: variable 'ansible_shell_type' from source: unknown 30575 1726867670.95316: variable 'ansible_shell_executable' from source: unknown 30575 1726867670.95319: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867670.95321: variable 'ansible_pipelining' from source: unknown 30575 1726867670.95331: variable 'ansible_timeout' from source: unknown 30575 1726867670.95334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867670.95414: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867670.95425: variable 'omit' from 
source: magic vars 30575 1726867670.95431: starting attempt loop 30575 1726867670.95434: running the handler 30575 1726867670.95444: _low_level_execute_command(): starting 30575 1726867670.95451: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867670.95960: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867670.95964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867670.95966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867670.95968: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867670.95970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867670.96028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867670.96031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867670.96033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867670.96093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867670.97772: stdout chunk (state=3): >>>/root <<< 30575 1726867670.97875: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 30575 1726867670.97904: stderr chunk (state=3): >>><<< 30575 1726867670.97908: stdout chunk (state=3): >>><<< 30575 1726867670.97924: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867670.97934: _low_level_execute_command(): starting 30575 1726867670.97939: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867670.9792454-35492-109238538140536 `" && echo ansible-tmp-1726867670.9792454-35492-109238538140536="` echo /root/.ansible/tmp/ansible-tmp-1726867670.9792454-35492-109238538140536 `" ) && sleep 0' 30575 1726867670.98354: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867670.98358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867670.98371: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867670.98423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867670.98427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867670.98481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867671.00362: stdout chunk (state=3): >>>ansible-tmp-1726867670.9792454-35492-109238538140536=/root/.ansible/tmp/ansible-tmp-1726867670.9792454-35492-109238538140536 <<< 30575 1726867671.00472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867671.00496: stderr chunk (state=3): >>><<< 30575 1726867671.00499: stdout chunk (state=3): >>><<< 30575 1726867671.00511: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867670.9792454-35492-109238538140536=/root/.ansible/tmp/ansible-tmp-1726867670.9792454-35492-109238538140536 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867671.00548: variable 'ansible_module_compression' from source: unknown 30575 1726867671.00582: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30575 1726867671.00624: variable 'ansible_facts' from source: unknown 30575 1726867671.00717: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867670.9792454-35492-109238538140536/AnsiballZ_network_connections.py 30575 1726867671.00810: Sending initial data 30575 1726867671.00814: Sent initial data (168 bytes) 30575 1726867671.01248: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867671.01251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867671.01254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867671.01259: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867671.01262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867671.01264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867671.01312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867671.01317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867671.01359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867671.02924: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 <<< 30575 1726867671.02928: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867671.02963: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867671.03015: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpjrfhp1tr /root/.ansible/tmp/ansible-tmp-1726867670.9792454-35492-109238538140536/AnsiballZ_network_connections.py <<< 30575 1726867671.03019: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867670.9792454-35492-109238538140536/AnsiballZ_network_connections.py" <<< 30575 1726867671.03054: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpjrfhp1tr" to remote "/root/.ansible/tmp/ansible-tmp-1726867670.9792454-35492-109238538140536/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867670.9792454-35492-109238538140536/AnsiballZ_network_connections.py" <<< 30575 1726867671.03755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867671.03791: stderr chunk (state=3): >>><<< 30575 1726867671.03795: stdout chunk (state=3): >>><<< 30575 1726867671.03831: done transferring module to remote 30575 1726867671.03839: _low_level_execute_command(): starting 30575 1726867671.03842: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867670.9792454-35492-109238538140536/ /root/.ansible/tmp/ansible-tmp-1726867670.9792454-35492-109238538140536/AnsiballZ_network_connections.py && sleep 0' 30575 1726867671.04239: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867671.04243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867671.04246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867671.04295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867671.04298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867671.04347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867671.06091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867671.06111: stderr chunk (state=3): >>><<< 30575 1726867671.06115: stdout chunk (state=3): >>><<< 30575 1726867671.06127: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867671.06130: _low_level_execute_command(): starting 30575 1726867671.06133: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867670.9792454-35492-109238538140536/AnsiballZ_network_connections.py && sleep 0' 30575 1726867671.06529: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867671.06532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867671.06534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867671.06536: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867671.06585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867671.06597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867671.06639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867671.34435: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 0739a9ca-1102-4bed-b35d-0eb6b0f005e6\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30575 1726867671.37355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867671.37376: stderr chunk (state=3): >>><<< 30575 1726867671.37381: stdout chunk (state=3): >>><<< 30575 1726867671.37399: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 0739a9ca-1102-4bed-b35d-0eb6b0f005e6\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867671.37436: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867670.9792454-35492-109238538140536/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867671.37439: _low_level_execute_command(): starting 30575 1726867671.37444: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867670.9792454-35492-109238538140536/ > /dev/null 2>&1 && sleep 0' 30575 1726867671.37866: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867671.37869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 
1726867671.37872: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867671.37874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867671.37876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867671.37925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867671.37928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867671.37981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867671.39813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867671.39838: stderr chunk (state=3): >>><<< 30575 1726867671.39842: stdout chunk (state=3): >>><<< 30575 1726867671.39854: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867671.39859: handler run complete 30575 1726867671.39883: attempt loop complete, returning result 30575 1726867671.39886: _execute() done 30575 1726867671.39888: dumping result to json 30575 1726867671.39893: done dumping result, returning 30575 1726867671.39901: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-e081-a588-0000000021b3] 30575 1726867671.39906: sending task result for task 0affcac9-a3a5-e081-a588-0000000021b3 30575 1726867671.40011: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021b3 30575 1726867671.40014: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 0739a9ca-1102-4bed-b35d-0eb6b0f005e6 30575 1726867671.40136: no more pending results, returning what we have 30575 1726867671.40139: results queue empty 30575 1726867671.40140: checking for any_errors_fatal 30575 1726867671.40150: done checking for any_errors_fatal 30575 1726867671.40150: checking for max_fail_percentage 30575 1726867671.40152: done 
checking for max_fail_percentage 30575 1726867671.40153: checking to see if all hosts have failed and the running result is not ok 30575 1726867671.40154: done checking to see if all hosts have failed 30575 1726867671.40154: getting the remaining hosts for this loop 30575 1726867671.40156: done getting the remaining hosts for this loop 30575 1726867671.40159: getting the next task for host managed_node3 30575 1726867671.40166: done getting next task for host managed_node3 30575 1726867671.40171: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867671.40175: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867671.40188: getting variables 30575 1726867671.40189: in VariableManager get_vars() 30575 1726867671.40230: Calling all_inventory to load vars for managed_node3 30575 1726867671.40232: Calling groups_inventory to load vars for managed_node3 30575 1726867671.40234: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867671.40243: Calling all_plugins_play to load vars for managed_node3 30575 1726867671.40246: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867671.40248: Calling groups_plugins_play to load vars for managed_node3 30575 1726867671.41207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867671.42058: done with get_vars() 30575 1726867671.42075: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:27:51 -0400 (0:00:00.542) 0:01:46.798 ****** 30575 1726867671.42140: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867671.42382: worker is 1 (out of 1 available) 30575 1726867671.42397: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867671.42411: done queuing things up, now waiting for results queue to drain 30575 1726867671.42413: waiting for pending results... 
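The `module_args` logged for the "Configure networking connection profiles" task above can be reproduced by a `fedora.linux_system_roles.network` role invocation roughly like the following. This is a hedged reconstruction from the logged arguments only; the play targeting and variable placement (`hosts:`, putting the vars on the role) are assumptions, not taken from the log:

```yaml
# Sketch reconstructed from the logged module_args; host pattern is an assumption.
- hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: statebr
            persistent_state: present
            type: bridge
            ip:
              dhcp4: false
              auto6: false
```

With these vars the role calls the `network_connections` module with `provider: nm`, matching the `changed: true` result and the `add connection statebr` stderr line recorded above.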
30575 1726867671.42606: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867671.42713: in run() - task 0affcac9-a3a5-e081-a588-0000000021b4 30575 1726867671.42727: variable 'ansible_search_path' from source: unknown 30575 1726867671.42730: variable 'ansible_search_path' from source: unknown 30575 1726867671.42761: calling self._execute() 30575 1726867671.42834: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867671.42838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867671.42847: variable 'omit' from source: magic vars 30575 1726867671.43119: variable 'ansible_distribution_major_version' from source: facts 30575 1726867671.43131: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867671.43213: variable 'network_state' from source: role '' defaults 30575 1726867671.43224: Evaluated conditional (network_state != {}): False 30575 1726867671.43227: when evaluation is False, skipping this task 30575 1726867671.43230: _execute() done 30575 1726867671.43232: dumping result to json 30575 1726867671.43235: done dumping result, returning 30575 1726867671.43243: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-e081-a588-0000000021b4] 30575 1726867671.43247: sending task result for task 0affcac9-a3a5-e081-a588-0000000021b4 30575 1726867671.43332: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021b4 30575 1726867671.43334: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867671.43387: no more pending results, returning what we have 30575 1726867671.43391: results queue empty 30575 1726867671.43392: checking for any_errors_fatal 30575 1726867671.43405: done checking for any_errors_fatal 
30575 1726867671.43406: checking for max_fail_percentage 30575 1726867671.43407: done checking for max_fail_percentage 30575 1726867671.43408: checking to see if all hosts have failed and the running result is not ok 30575 1726867671.43409: done checking to see if all hosts have failed 30575 1726867671.43410: getting the remaining hosts for this loop 30575 1726867671.43411: done getting the remaining hosts for this loop 30575 1726867671.43415: getting the next task for host managed_node3 30575 1726867671.43423: done getting next task for host managed_node3 30575 1726867671.43426: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867671.43430: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867671.43449: getting variables 30575 1726867671.43451: in VariableManager get_vars() 30575 1726867671.43490: Calling all_inventory to load vars for managed_node3 30575 1726867671.43493: Calling groups_inventory to load vars for managed_node3 30575 1726867671.43495: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867671.43504: Calling all_plugins_play to load vars for managed_node3 30575 1726867671.43506: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867671.43509: Calling groups_plugins_play to load vars for managed_node3 30575 1726867671.44262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867671.45114: done with get_vars() 30575 1726867671.45130: done getting variables 30575 1726867671.45169: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:27:51 -0400 (0:00:00.030) 0:01:46.829 ****** 30575 1726867671.45196: entering _queue_task() for managed_node3/debug 30575 1726867671.45405: worker is 1 (out of 1 available) 30575 1726867671.45419: exiting _queue_task() for managed_node3/debug 30575 1726867671.45432: done queuing things up, now waiting for results queue to drain 30575 1726867671.45433: waiting for pending results... 
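The "Configure networking state" task above is skipped because the role default `network_state: {}` leaves the conditional `network_state != {}` false. A hedged, illustrative sketch of what would make that task run (the interface values here are examples in Nmstate-style schema, not taken from this run):

```yaml
# Illustrative only: a non-empty network_state makes
# "network_state != {}" evaluate True, so the task is no longer skipped.
network_state:
  interfaces:
    - name: statebr
      type: linux-bridge
      state: up
```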
30575 1726867671.45793: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867671.45798: in run() - task 0affcac9-a3a5-e081-a588-0000000021b5 30575 1726867671.45803: variable 'ansible_search_path' from source: unknown 30575 1726867671.45812: variable 'ansible_search_path' from source: unknown 30575 1726867671.45855: calling self._execute() 30575 1726867671.45955: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867671.45973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867671.45991: variable 'omit' from source: magic vars 30575 1726867671.46379: variable 'ansible_distribution_major_version' from source: facts 30575 1726867671.46403: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867671.46420: variable 'omit' from source: magic vars 30575 1726867671.46511: variable 'omit' from source: magic vars 30575 1726867671.46542: variable 'omit' from source: magic vars 30575 1726867671.46622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867671.46636: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867671.46666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867671.46692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867671.46713: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867671.46782: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867671.46786: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867671.46788: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30575 1726867671.46888: Set connection var ansible_pipelining to False 30575 1726867671.46899: Set connection var ansible_shell_type to sh 30575 1726867671.46982: Set connection var ansible_shell_executable to /bin/sh 30575 1726867671.46986: Set connection var ansible_timeout to 10 30575 1726867671.46988: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867671.46990: Set connection var ansible_connection to ssh 30575 1726867671.46992: variable 'ansible_shell_executable' from source: unknown 30575 1726867671.46994: variable 'ansible_connection' from source: unknown 30575 1726867671.46996: variable 'ansible_module_compression' from source: unknown 30575 1726867671.46999: variable 'ansible_shell_type' from source: unknown 30575 1726867671.47001: variable 'ansible_shell_executable' from source: unknown 30575 1726867671.47003: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867671.47005: variable 'ansible_pipelining' from source: unknown 30575 1726867671.47007: variable 'ansible_timeout' from source: unknown 30575 1726867671.47022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867671.47171: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867671.47232: variable 'omit' from source: magic vars 30575 1726867671.47235: starting attempt loop 30575 1726867671.47237: running the handler 30575 1726867671.47325: variable '__network_connections_result' from source: set_fact 30575 1726867671.47370: handler run complete 30575 1726867671.47387: attempt loop complete, returning result 30575 1726867671.47390: _execute() done 30575 1726867671.47393: dumping result to json 30575 1726867671.47395: 
done dumping result, returning 30575 1726867671.47403: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-e081-a588-0000000021b5] 30575 1726867671.47408: sending task result for task 0affcac9-a3a5-e081-a588-0000000021b5 30575 1726867671.47498: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021b5 30575 1726867671.47502: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 0739a9ca-1102-4bed-b35d-0eb6b0f005e6" ] } 30575 1726867671.47569: no more pending results, returning what we have 30575 1726867671.47573: results queue empty 30575 1726867671.47574: checking for any_errors_fatal 30575 1726867671.47580: done checking for any_errors_fatal 30575 1726867671.47581: checking for max_fail_percentage 30575 1726867671.47583: done checking for max_fail_percentage 30575 1726867671.47584: checking to see if all hosts have failed and the running result is not ok 30575 1726867671.47585: done checking to see if all hosts have failed 30575 1726867671.47585: getting the remaining hosts for this loop 30575 1726867671.47587: done getting the remaining hosts for this loop 30575 1726867671.47590: getting the next task for host managed_node3 30575 1726867671.47597: done getting next task for host managed_node3 30575 1726867671.47601: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867671.47605: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867671.47615: getting variables 30575 1726867671.47619: in VariableManager get_vars() 30575 1726867671.47655: Calling all_inventory to load vars for managed_node3 30575 1726867671.47657: Calling groups_inventory to load vars for managed_node3 30575 1726867671.47660: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867671.47668: Calling all_plugins_play to load vars for managed_node3 30575 1726867671.47671: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867671.47674: Calling groups_plugins_play to load vars for managed_node3 30575 1726867671.48864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867671.50343: done with get_vars() 30575 1726867671.50370: done getting variables 30575 1726867671.50433: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:27:51 -0400 (0:00:00.052) 0:01:46.882 ****** 30575 1726867671.50474: entering _queue_task() for managed_node3/debug 30575 1726867671.50837: worker is 1 (out of 1 available) 30575 1726867671.50851: exiting _queue_task() for managed_node3/debug 30575 1726867671.50866: done queuing things up, now waiting for results queue to drain 30575 1726867671.50868: waiting for pending results... 30575 1726867671.51206: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867671.51484: in run() - task 0affcac9-a3a5-e081-a588-0000000021b6 30575 1726867671.51487: variable 'ansible_search_path' from source: unknown 30575 1726867671.51490: variable 'ansible_search_path' from source: unknown 30575 1726867671.51493: calling self._execute() 30575 1726867671.51540: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867671.51552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867671.51567: variable 'omit' from source: magic vars 30575 1726867671.51989: variable 'ansible_distribution_major_version' from source: facts 30575 1726867671.52008: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867671.52013: variable 'omit' from source: magic vars 30575 1726867671.52067: variable 'omit' from source: magic vars 30575 1726867671.52094: variable 'omit' from source: magic vars 30575 1726867671.52147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867671.52167: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867671.52187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867671.52201: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867671.52215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867671.52239: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867671.52242: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867671.52245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867671.52316: Set connection var ansible_pipelining to False 30575 1726867671.52321: Set connection var ansible_shell_type to sh 30575 1726867671.52372: Set connection var ansible_shell_executable to /bin/sh 30575 1726867671.52375: Set connection var ansible_timeout to 10 30575 1726867671.52381: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867671.52384: Set connection var ansible_connection to ssh 30575 1726867671.52386: variable 'ansible_shell_executable' from source: unknown 30575 1726867671.52388: variable 'ansible_connection' from source: unknown 30575 1726867671.52391: variable 'ansible_module_compression' from source: unknown 30575 1726867671.52393: variable 'ansible_shell_type' from source: unknown 30575 1726867671.52395: variable 'ansible_shell_executable' from source: unknown 30575 1726867671.52396: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867671.52398: variable 'ansible_pipelining' from source: unknown 30575 1726867671.52400: variable 'ansible_timeout' from source: unknown 30575 1726867671.52402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867671.52480: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867671.52487: variable 'omit' from source: magic vars 30575 1726867671.52493: starting attempt loop 30575 1726867671.52496: running the handler 30575 1726867671.52540: variable '__network_connections_result' from source: set_fact 30575 1726867671.52598: variable '__network_connections_result' from source: set_fact 30575 1726867671.52685: handler run complete 30575 1726867671.52705: attempt loop complete, returning result 30575 1726867671.52708: _execute() done 30575 1726867671.52710: dumping result to json 30575 1726867671.52714: done dumping result, returning 30575 1726867671.52725: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-e081-a588-0000000021b6] 30575 1726867671.52732: sending task result for task 0affcac9-a3a5-e081-a588-0000000021b6 30575 1726867671.52821: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021b6 30575 1726867671.52824: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 0739a9ca-1102-4bed-b35d-0eb6b0f005e6\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 0739a9ca-1102-4bed-b35d-0eb6b0f005e6" ] } } 30575 1726867671.52913: no more pending results, returning what we have 30575 1726867671.52917: results queue 
empty 30575 1726867671.52918: checking for any_errors_fatal 30575 1726867671.52925: done checking for any_errors_fatal 30575 1726867671.52925: checking for max_fail_percentage 30575 1726867671.52927: done checking for max_fail_percentage 30575 1726867671.52928: checking to see if all hosts have failed and the running result is not ok 30575 1726867671.52929: done checking to see if all hosts have failed 30575 1726867671.52930: getting the remaining hosts for this loop 30575 1726867671.52932: done getting the remaining hosts for this loop 30575 1726867671.52937: getting the next task for host managed_node3 30575 1726867671.52945: done getting next task for host managed_node3 30575 1726867671.52949: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867671.52953: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867671.52965: getting variables 30575 1726867671.52966: in VariableManager get_vars() 30575 1726867671.53015: Calling all_inventory to load vars for managed_node3 30575 1726867671.53017: Calling groups_inventory to load vars for managed_node3 30575 1726867671.53019: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867671.53028: Calling all_plugins_play to load vars for managed_node3 30575 1726867671.53030: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867671.53033: Calling groups_plugins_play to load vars for managed_node3 30575 1726867671.53952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867671.60534: done with get_vars() 30575 1726867671.60562: done getting variables 30575 1726867671.60613: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:27:51 -0400 (0:00:00.101) 0:01:46.983 ****** 30575 1726867671.60644: entering _queue_task() for managed_node3/debug 30575 1726867671.61023: worker is 1 (out of 1 available) 30575 1726867671.61038: exiting _queue_task() for managed_node3/debug 30575 1726867671.61051: done queuing things up, now waiting for results queue to drain 30575 1726867671.61053: waiting for pending results... 
30575 1726867671.61400: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867671.61490: in run() - task 0affcac9-a3a5-e081-a588-0000000021b7 30575 1726867671.61583: variable 'ansible_search_path' from source: unknown 30575 1726867671.61587: variable 'ansible_search_path' from source: unknown 30575 1726867671.61590: calling self._execute() 30575 1726867671.61662: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867671.61680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867671.61699: variable 'omit' from source: magic vars 30575 1726867671.62108: variable 'ansible_distribution_major_version' from source: facts 30575 1726867671.62128: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867671.62262: variable 'network_state' from source: role '' defaults 30575 1726867671.62285: Evaluated conditional (network_state != {}): False 30575 1726867671.62293: when evaluation is False, skipping this task 30575 1726867671.62300: _execute() done 30575 1726867671.62308: dumping result to json 30575 1726867671.62318: done dumping result, returning 30575 1726867671.62364: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-e081-a588-0000000021b7] 30575 1726867671.62369: sending task result for task 0affcac9-a3a5-e081-a588-0000000021b7 skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30575 1726867671.62509: no more pending results, returning what we have 30575 1726867671.62514: results queue empty 30575 1726867671.62514: checking for any_errors_fatal 30575 1726867671.62525: done checking for any_errors_fatal 30575 1726867671.62526: checking for max_fail_percentage 30575 1726867671.62527: done checking for max_fail_percentage 30575 1726867671.62528: checking to see if all hosts have 
failed and the running result is not ok 30575 1726867671.62529: done checking to see if all hosts have failed 30575 1726867671.62530: getting the remaining hosts for this loop 30575 1726867671.62531: done getting the remaining hosts for this loop 30575 1726867671.62535: getting the next task for host managed_node3 30575 1726867671.62544: done getting next task for host managed_node3 30575 1726867671.62547: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867671.62553: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867671.62575: getting variables 30575 1726867671.62578: in VariableManager get_vars() 30575 1726867671.62624: Calling all_inventory to load vars for managed_node3 30575 1726867671.62627: Calling groups_inventory to load vars for managed_node3 30575 1726867671.62629: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867671.62642: Calling all_plugins_play to load vars for managed_node3 30575 1726867671.62645: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867671.62648: Calling groups_plugins_play to load vars for managed_node3 30575 1726867671.63185: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021b7 30575 1726867671.63189: WORKER PROCESS EXITING 30575 1726867671.64178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867671.65763: done with get_vars() 30575 1726867671.65787: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:27:51 -0400 (0:00:00.052) 0:01:47.036 ****** 30575 1726867671.65888: entering _queue_task() for managed_node3/ping 30575 1726867671.66245: worker is 1 (out of 1 available) 30575 1726867671.66259: exiting _queue_task() for managed_node3/ping 30575 1726867671.66278: done queuing things up, now waiting for results queue to drain 30575 1726867671.66280: waiting for pending results... 
30575 1726867671.66593: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867671.66754: in run() - task 0affcac9-a3a5-e081-a588-0000000021b8 30575 1726867671.66885: variable 'ansible_search_path' from source: unknown 30575 1726867671.66889: variable 'ansible_search_path' from source: unknown 30575 1726867671.66893: calling self._execute() 30575 1726867671.66916: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867671.66929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867671.66941: variable 'omit' from source: magic vars 30575 1726867671.67300: variable 'ansible_distribution_major_version' from source: facts 30575 1726867671.67310: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867671.67315: variable 'omit' from source: magic vars 30575 1726867671.67361: variable 'omit' from source: magic vars 30575 1726867671.67387: variable 'omit' from source: magic vars 30575 1726867671.67420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867671.67448: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867671.67473: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867671.67493: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867671.67501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867671.67525: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867671.67529: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867671.67531: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867671.67600: Set connection var ansible_pipelining to False 30575 1726867671.67603: Set connection var ansible_shell_type to sh 30575 1726867671.67608: Set connection var ansible_shell_executable to /bin/sh 30575 1726867671.67613: Set connection var ansible_timeout to 10 30575 1726867671.67620: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867671.67625: Set connection var ansible_connection to ssh 30575 1726867671.67645: variable 'ansible_shell_executable' from source: unknown 30575 1726867671.67648: variable 'ansible_connection' from source: unknown 30575 1726867671.67650: variable 'ansible_module_compression' from source: unknown 30575 1726867671.67653: variable 'ansible_shell_type' from source: unknown 30575 1726867671.67655: variable 'ansible_shell_executable' from source: unknown 30575 1726867671.67657: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867671.67659: variable 'ansible_pipelining' from source: unknown 30575 1726867671.67663: variable 'ansible_timeout' from source: unknown 30575 1726867671.67670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867671.67809: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867671.67821: variable 'omit' from source: magic vars 30575 1726867671.67824: starting attempt loop 30575 1726867671.67827: running the handler 30575 1726867671.67837: _low_level_execute_command(): starting 30575 1726867671.67844: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867671.68351: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 
1726867671.68355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867671.68358: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867671.68361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867671.68412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867671.68415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867671.68418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867671.68476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867671.70149: stdout chunk (state=3): >>>/root <<< 30575 1726867671.70251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867671.70275: stderr chunk (state=3): >>><<< 30575 1726867671.70281: stdout chunk (state=3): >>><<< 30575 1726867671.70487: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867671.70492: _low_level_execute_command(): starting 30575 1726867671.70495: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867671.7030952-35520-207142190029255 `" && echo ansible-tmp-1726867671.7030952-35520-207142190029255="` echo /root/.ansible/tmp/ansible-tmp-1726867671.7030952-35520-207142190029255 `" ) && sleep 0' 30575 1726867671.70894: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867671.70910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867671.70940: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867671.70973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867671.70989: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867671.71043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867671.72936: stdout chunk (state=3): >>>ansible-tmp-1726867671.7030952-35520-207142190029255=/root/.ansible/tmp/ansible-tmp-1726867671.7030952-35520-207142190029255 <<< 30575 1726867671.73047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867671.73075: stderr chunk (state=3): >>><<< 30575 1726867671.73081: stdout chunk (state=3): >>><<< 30575 1726867671.73097: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867671.7030952-35520-207142190029255=/root/.ansible/tmp/ansible-tmp-1726867671.7030952-35520-207142190029255 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867671.73134: variable 'ansible_module_compression' from source: unknown 30575 1726867671.73166: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30575 1726867671.73200: variable 'ansible_facts' from source: unknown 30575 1726867671.73253: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867671.7030952-35520-207142190029255/AnsiballZ_ping.py 30575 1726867671.73352: Sending initial data 30575 1726867671.73356: Sent initial data (153 bytes) 30575 1726867671.73768: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867671.73772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867671.73806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867671.73809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867671.73811: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 30575 1726867671.73814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867671.73869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867671.73872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867671.73886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867671.73923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867671.75441: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867671.75481: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867671.75535: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp52i9h_6c /root/.ansible/tmp/ansible-tmp-1726867671.7030952-35520-207142190029255/AnsiballZ_ping.py <<< 30575 1726867671.75538: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867671.7030952-35520-207142190029255/AnsiballZ_ping.py" <<< 30575 1726867671.75570: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp52i9h_6c" to remote "/root/.ansible/tmp/ansible-tmp-1726867671.7030952-35520-207142190029255/AnsiballZ_ping.py" <<< 30575 1726867671.75578: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867671.7030952-35520-207142190029255/AnsiballZ_ping.py" <<< 30575 1726867671.76097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867671.76142: stderr chunk (state=3): >>><<< 30575 1726867671.76145: stdout chunk (state=3): >>><<< 30575 1726867671.76189: done transferring module to remote 30575 1726867671.76199: _low_level_execute_command(): starting 30575 1726867671.76203: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867671.7030952-35520-207142190029255/ /root/.ansible/tmp/ansible-tmp-1726867671.7030952-35520-207142190029255/AnsiballZ_ping.py && sleep 0' 30575 1726867671.76653: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867671.76656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867671.76659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867671.76661: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867671.76664: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867671.76722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867671.76773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867671.78545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867671.78570: stderr chunk (state=3): >>><<< 30575 1726867671.78573: stdout chunk (state=3): >>><<< 30575 1726867671.78589: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867671.78593: _low_level_execute_command(): starting 30575 1726867671.78596: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867671.7030952-35520-207142190029255/AnsiballZ_ping.py && sleep 0' 30575 1726867671.79037: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867671.79041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867671.79043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867671.79046: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867671.79049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867671.79093: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867671.79096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867671.79151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867671.94028: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30575 1726867671.95311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867671.95342: stderr chunk (state=3): >>><<< 30575 1726867671.95345: stdout chunk (state=3): >>><<< 30575 1726867671.95361: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.15.68 closed. 30575 1726867671.95390: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867671.7030952-35520-207142190029255/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867671.95400: _low_level_execute_command(): starting 30575 1726867671.95407: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867671.7030952-35520-207142190029255/ > /dev/null 2>&1 && sleep 0' 30575 1726867671.95858: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867671.95862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867671.95864: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867671.95866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867671.95920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867671.95924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867671.95930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867671.95975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867671.97801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867671.97841: stderr chunk (state=3): >>><<< 30575 1726867671.97844: stdout chunk (state=3): >>><<< 30575 1726867671.97866: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 30575 1726867671.97870: handler run complete 30575 1726867671.97887: attempt loop complete, returning result 30575 1726867671.97890: _execute() done 30575 1726867671.97893: dumping result to json 30575 1726867671.97895: done dumping result, returning 30575 1726867671.97903: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-e081-a588-0000000021b8] 30575 1726867671.97908: sending task result for task 0affcac9-a3a5-e081-a588-0000000021b8 30575 1726867671.98000: done sending task result for task 0affcac9-a3a5-e081-a588-0000000021b8 30575 1726867671.98004: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30575 1726867671.98064: no more pending results, returning what we have 30575 1726867671.98067: results queue empty 30575 1726867671.98068: checking for any_errors_fatal 30575 1726867671.98073: done checking for any_errors_fatal 30575 1726867671.98074: checking for max_fail_percentage 30575 1726867671.98075: done checking for max_fail_percentage 30575 1726867671.98076: checking to see if all hosts have failed and the running result is not ok 30575 1726867671.98078: done checking to see if all hosts have failed 30575 1726867671.98079: getting the remaining hosts for this loop 30575 1726867671.98080: done getting the remaining hosts for this loop 30575 1726867671.98084: getting the next task for host managed_node3 30575 1726867671.98095: done getting next task for host managed_node3 30575 1726867671.98097: ^ task is: TASK: meta (role_complete) 30575 1726867671.98102: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867671.98114: getting variables 30575 1726867671.98116: in VariableManager get_vars() 30575 1726867671.98161: Calling all_inventory to load vars for managed_node3 30575 1726867671.98164: Calling groups_inventory to load vars for managed_node3 30575 1726867671.98166: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867671.98175: Calling all_plugins_play to load vars for managed_node3 30575 1726867671.98183: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867671.98186: Calling groups_plugins_play to load vars for managed_node3 30575 1726867671.99161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867672.00006: done with get_vars() 30575 1726867672.00022: done getting variables 30575 1726867672.00081: done queuing things up, now waiting for results queue to drain 30575 1726867672.00082: results queue empty 30575 1726867672.00083: checking for any_errors_fatal 30575 1726867672.00084: done checking for any_errors_fatal 30575 1726867672.00085: checking for max_fail_percentage 30575 1726867672.00085: done checking for max_fail_percentage 30575 1726867672.00086: checking to see if all 
hosts have failed and the running result is not ok 30575 1726867672.00086: done checking to see if all hosts have failed 30575 1726867672.00087: getting the remaining hosts for this loop 30575 1726867672.00087: done getting the remaining hosts for this loop 30575 1726867672.00089: getting the next task for host managed_node3 30575 1726867672.00092: done getting next task for host managed_node3 30575 1726867672.00093: ^ task is: TASK: Show result 30575 1726867672.00095: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867672.00096: getting variables 30575 1726867672.00097: in VariableManager get_vars() 30575 1726867672.00105: Calling all_inventory to load vars for managed_node3 30575 1726867672.00106: Calling groups_inventory to load vars for managed_node3 30575 1726867672.00108: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867672.00111: Calling all_plugins_play to load vars for managed_node3 30575 1726867672.00112: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867672.00114: Calling groups_plugins_play to load vars for managed_node3 30575 1726867672.00738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867672.01588: done with get_vars() 30575 1726867672.01601: done getting variables 30575 1726867672.01629: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 17:27:52 -0400 (0:00:00.357) 0:01:47.394 ****** 30575 1726867672.01652: entering _queue_task() for managed_node3/debug 30575 1726867672.01892: worker is 1 (out of 1 available) 30575 1726867672.01907: exiting _queue_task() for managed_node3/debug 30575 1726867672.01919: done queuing things up, now waiting for results queue to drain 30575 1726867672.01921: waiting for pending results... 
30575 1726867672.02108: running TaskExecutor() for managed_node3/TASK: Show result 30575 1726867672.02204: in run() - task 0affcac9-a3a5-e081-a588-00000000213a 30575 1726867672.02216: variable 'ansible_search_path' from source: unknown 30575 1726867672.02220: variable 'ansible_search_path' from source: unknown 30575 1726867672.02252: calling self._execute() 30575 1726867672.02325: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867672.02329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867672.02338: variable 'omit' from source: magic vars 30575 1726867672.02637: variable 'ansible_distribution_major_version' from source: facts 30575 1726867672.02647: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867672.02653: variable 'omit' from source: magic vars 30575 1726867672.02685: variable 'omit' from source: magic vars 30575 1726867672.02711: variable 'omit' from source: magic vars 30575 1726867672.02741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867672.02768: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867672.02785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867672.02803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867672.02810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867672.02834: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867672.02838: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867672.02840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867672.02911: Set 
connection var ansible_pipelining to False 30575 1726867672.02914: Set connection var ansible_shell_type to sh 30575 1726867672.02919: Set connection var ansible_shell_executable to /bin/sh 30575 1726867672.02921: Set connection var ansible_timeout to 10 30575 1726867672.02927: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867672.02934: Set connection var ansible_connection to ssh 30575 1726867672.02951: variable 'ansible_shell_executable' from source: unknown 30575 1726867672.02954: variable 'ansible_connection' from source: unknown 30575 1726867672.02957: variable 'ansible_module_compression' from source: unknown 30575 1726867672.02959: variable 'ansible_shell_type' from source: unknown 30575 1726867672.02962: variable 'ansible_shell_executable' from source: unknown 30575 1726867672.02964: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867672.02968: variable 'ansible_pipelining' from source: unknown 30575 1726867672.02970: variable 'ansible_timeout' from source: unknown 30575 1726867672.02974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867672.03283: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867672.03288: variable 'omit' from source: magic vars 30575 1726867672.03290: starting attempt loop 30575 1726867672.03292: running the handler 30575 1726867672.03294: variable '__network_connections_result' from source: set_fact 30575 1726867672.03296: variable '__network_connections_result' from source: set_fact 30575 1726867672.03376: handler run complete 30575 1726867672.03416: attempt loop complete, returning result 30575 1726867672.03425: _execute() done 30575 1726867672.03432: dumping result to json 30575 
1726867672.03442: done dumping result, returning 30575 1726867672.03453: done running TaskExecutor() for managed_node3/TASK: Show result [0affcac9-a3a5-e081-a588-00000000213a] 30575 1726867672.03462: sending task result for task 0affcac9-a3a5-e081-a588-00000000213a ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 0739a9ca-1102-4bed-b35d-0eb6b0f005e6\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 0739a9ca-1102-4bed-b35d-0eb6b0f005e6" ] } } 30575 1726867672.03631: no more pending results, returning what we have 30575 1726867672.03635: results queue empty 30575 1726867672.03636: checking for any_errors_fatal 30575 1726867672.03638: done checking for any_errors_fatal 30575 1726867672.03639: checking for max_fail_percentage 30575 1726867672.03640: done checking for max_fail_percentage 30575 1726867672.03641: checking to see if all hosts have failed and the running result is not ok 30575 1726867672.03643: done checking to see if all hosts have failed 30575 1726867672.03644: getting the remaining hosts for this loop 30575 1726867672.03646: done getting the remaining hosts for this loop 30575 1726867672.03650: getting the next task for host managed_node3 30575 1726867672.03660: done getting next task for host managed_node3 30575 1726867672.03665: ^ task is: TASK: Include network role 30575 1726867672.03669: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867672.03674: getting variables 30575 1726867672.03676: in VariableManager get_vars() 30575 1726867672.03719: Calling all_inventory to load vars for managed_node3 30575 1726867672.03721: Calling groups_inventory to load vars for managed_node3 30575 1726867672.03725: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867672.03736: Calling all_plugins_play to load vars for managed_node3 30575 1726867672.03739: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867672.03742: Calling groups_plugins_play to load vars for managed_node3 30575 1726867672.04638: done sending task result for task 0affcac9-a3a5-e081-a588-00000000213a 30575 1726867672.04642: WORKER PROCESS EXITING 30575 1726867672.04893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867672.05758: done with get_vars() 30575 1726867672.05771: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 17:27:52 -0400 (0:00:00.041) 0:01:47.435 ****** 30575 1726867672.05837: entering _queue_task() 
for managed_node3/include_role 30575 1726867672.06081: worker is 1 (out of 1 available) 30575 1726867672.06092: exiting _queue_task() for managed_node3/include_role 30575 1726867672.06105: done queuing things up, now waiting for results queue to drain 30575 1726867672.06106: waiting for pending results... 30575 1726867672.06359: running TaskExecutor() for managed_node3/TASK: Include network role 30575 1726867672.06457: in run() - task 0affcac9-a3a5-e081-a588-00000000213e 30575 1726867672.06471: variable 'ansible_search_path' from source: unknown 30575 1726867672.06475: variable 'ansible_search_path' from source: unknown 30575 1726867672.06506: calling self._execute() 30575 1726867672.06581: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867672.06585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867672.06594: variable 'omit' from source: magic vars 30575 1726867672.06881: variable 'ansible_distribution_major_version' from source: facts 30575 1726867672.06891: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867672.06897: _execute() done 30575 1726867672.06900: dumping result to json 30575 1726867672.06905: done dumping result, returning 30575 1726867672.06912: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcac9-a3a5-e081-a588-00000000213e] 30575 1726867672.06917: sending task result for task 0affcac9-a3a5-e081-a588-00000000213e 30575 1726867672.07023: done sending task result for task 0affcac9-a3a5-e081-a588-00000000213e 30575 1726867672.07026: WORKER PROCESS EXITING 30575 1726867672.07052: no more pending results, returning what we have 30575 1726867672.07057: in VariableManager get_vars() 30575 1726867672.07107: Calling all_inventory to load vars for managed_node3 30575 1726867672.07110: Calling groups_inventory to load vars for managed_node3 30575 1726867672.07114: Calling all_plugins_inventory to load vars for managed_node3 
30575 1726867672.07125: Calling all_plugins_play to load vars for managed_node3 30575 1726867672.07127: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867672.07130: Calling groups_plugins_play to load vars for managed_node3 30575 1726867672.08135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867672.08990: done with get_vars() 30575 1726867672.09003: variable 'ansible_search_path' from source: unknown 30575 1726867672.09004: variable 'ansible_search_path' from source: unknown 30575 1726867672.09091: variable 'omit' from source: magic vars 30575 1726867672.09117: variable 'omit' from source: magic vars 30575 1726867672.09126: variable 'omit' from source: magic vars 30575 1726867672.09128: we have included files to process 30575 1726867672.09129: generating all_blocks data 30575 1726867672.09130: done generating all_blocks data 30575 1726867672.09134: processing included file: fedora.linux_system_roles.network 30575 1726867672.09146: in VariableManager get_vars() 30575 1726867672.09156: done with get_vars() 30575 1726867672.09173: in VariableManager get_vars() 30575 1726867672.09186: done with get_vars() 30575 1726867672.09213: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30575 1726867672.09283: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30575 1726867672.09332: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30575 1726867672.09670: in VariableManager get_vars() 30575 1726867672.09686: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867672.10866: iterating over new_blocks loaded from include file 30575 1726867672.10867: in VariableManager get_vars() 30575 1726867672.10880: done with get_vars() 30575 
1726867672.10881: filtering new block on tags 30575 1726867672.11031: done filtering new block on tags 30575 1726867672.11033: in VariableManager get_vars() 30575 1726867672.11045: done with get_vars() 30575 1726867672.11046: filtering new block on tags 30575 1726867672.11056: done filtering new block on tags 30575 1726867672.11057: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30575 1726867672.11061: extending task lists for all hosts with included blocks 30575 1726867672.11123: done extending task lists 30575 1726867672.11123: done processing included files 30575 1726867672.11124: results queue empty 30575 1726867672.11124: checking for any_errors_fatal 30575 1726867672.11128: done checking for any_errors_fatal 30575 1726867672.11128: checking for max_fail_percentage 30575 1726867672.11129: done checking for max_fail_percentage 30575 1726867672.11129: checking to see if all hosts have failed and the running result is not ok 30575 1726867672.11130: done checking to see if all hosts have failed 30575 1726867672.11130: getting the remaining hosts for this loop 30575 1726867672.11131: done getting the remaining hosts for this loop 30575 1726867672.11133: getting the next task for host managed_node3 30575 1726867672.11136: done getting next task for host managed_node3 30575 1726867672.11137: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867672.11139: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867672.11146: getting variables 30575 1726867672.11147: in VariableManager get_vars() 30575 1726867672.11157: Calling all_inventory to load vars for managed_node3 30575 1726867672.11158: Calling groups_inventory to load vars for managed_node3 30575 1726867672.11159: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867672.11163: Calling all_plugins_play to load vars for managed_node3 30575 1726867672.11164: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867672.11166: Calling groups_plugins_play to load vars for managed_node3 30575 1726867672.11771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867672.12622: done with get_vars() 30575 1726867672.12642: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:27:52 -0400 (0:00:00.068) 0:01:47.504 ****** 30575 1726867672.12691: entering _queue_task() for managed_node3/include_tasks 30575 1726867672.12919: worker is 1 (out of 1 available) 30575 
1726867672.12932: exiting _queue_task() for managed_node3/include_tasks 30575 1726867672.12945: done queuing things up, now waiting for results queue to drain 30575 1726867672.12946: waiting for pending results... 30575 1726867672.13126: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867672.13214: in run() - task 0affcac9-a3a5-e081-a588-000000002328 30575 1726867672.13228: variable 'ansible_search_path' from source: unknown 30575 1726867672.13231: variable 'ansible_search_path' from source: unknown 30575 1726867672.13258: calling self._execute() 30575 1726867672.13334: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867672.13339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867672.13346: variable 'omit' from source: magic vars 30575 1726867672.13617: variable 'ansible_distribution_major_version' from source: facts 30575 1726867672.13630: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867672.13633: _execute() done 30575 1726867672.13638: dumping result to json 30575 1726867672.13641: done dumping result, returning 30575 1726867672.13649: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-e081-a588-000000002328] 30575 1726867672.13654: sending task result for task 0affcac9-a3a5-e081-a588-000000002328 30575 1726867672.13734: done sending task result for task 0affcac9-a3a5-e081-a588-000000002328 30575 1726867672.13737: WORKER PROCESS EXITING 30575 1726867672.13788: no more pending results, returning what we have 30575 1726867672.13794: in VariableManager get_vars() 30575 1726867672.13844: Calling all_inventory to load vars for managed_node3 30575 1726867672.13849: Calling groups_inventory to load vars for managed_node3 30575 1726867672.13851: Calling all_plugins_inventory to load vars for managed_node3 
30575 1726867672.13861: Calling all_plugins_play to load vars for managed_node3 30575 1726867672.13864: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867672.13866: Calling groups_plugins_play to load vars for managed_node3 30575 1726867672.15123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867672.16353: done with get_vars() 30575 1726867672.16366: variable 'ansible_search_path' from source: unknown 30575 1726867672.16366: variable 'ansible_search_path' from source: unknown 30575 1726867672.16391: we have included files to process 30575 1726867672.16392: generating all_blocks data 30575 1726867672.16393: done generating all_blocks data 30575 1726867672.16395: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867672.16395: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867672.16397: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867672.16742: done processing included file 30575 1726867672.16744: iterating over new_blocks loaded from include file 30575 1726867672.16744: in VariableManager get_vars() 30575 1726867672.16759: done with get_vars() 30575 1726867672.16760: filtering new block on tags 30575 1726867672.16779: done filtering new block on tags 30575 1726867672.16781: in VariableManager get_vars() 30575 1726867672.16796: done with get_vars() 30575 1726867672.16797: filtering new block on tags 30575 1726867672.16822: done filtering new block on tags 30575 1726867672.16824: in VariableManager get_vars() 30575 1726867672.16837: done with get_vars() 30575 1726867672.16838: filtering new block on tags 30575 1726867672.16862: done filtering new block on tags 30575 1726867672.16863: done iterating over new_blocks 
loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30575 1726867672.16867: extending task lists for all hosts with included blocks 30575 1726867672.17786: done extending task lists 30575 1726867672.17787: done processing included files 30575 1726867672.17788: results queue empty 30575 1726867672.17788: checking for any_errors_fatal 30575 1726867672.17791: done checking for any_errors_fatal 30575 1726867672.17791: checking for max_fail_percentage 30575 1726867672.17792: done checking for max_fail_percentage 30575 1726867672.17792: checking to see if all hosts have failed and the running result is not ok 30575 1726867672.17793: done checking to see if all hosts have failed 30575 1726867672.17793: getting the remaining hosts for this loop 30575 1726867672.17794: done getting the remaining hosts for this loop 30575 1726867672.17796: getting the next task for host managed_node3 30575 1726867672.17799: done getting next task for host managed_node3 30575 1726867672.17800: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867672.17803: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867672.17810: getting variables 30575 1726867672.17810: in VariableManager get_vars() 30575 1726867672.17820: Calling all_inventory to load vars for managed_node3 30575 1726867672.17821: Calling groups_inventory to load vars for managed_node3 30575 1726867672.17822: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867672.17825: Calling all_plugins_play to load vars for managed_node3 30575 1726867672.17827: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867672.17828: Calling groups_plugins_play to load vars for managed_node3 30575 1726867672.18425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867672.19300: done with get_vars() 30575 1726867672.19313: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:27:52 -0400 (0:00:00.066) 0:01:47.571 ****** 30575 1726867672.19360: entering _queue_task() for managed_node3/setup 30575 1726867672.19571: worker is 1 (out of 1 available) 30575 1726867672.19585: exiting _queue_task() for managed_node3/setup 30575 
1726867672.19597: done queuing things up, now waiting for results queue to drain 30575 1726867672.19599: waiting for pending results... 30575 1726867672.19774: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867672.19874: in run() - task 0affcac9-a3a5-e081-a588-00000000237f 30575 1726867672.19887: variable 'ansible_search_path' from source: unknown 30575 1726867672.19891: variable 'ansible_search_path' from source: unknown 30575 1726867672.19917: calling self._execute() 30575 1726867672.19988: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867672.19993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867672.20000: variable 'omit' from source: magic vars 30575 1726867672.20264: variable 'ansible_distribution_major_version' from source: facts 30575 1726867672.20273: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867672.20422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867672.21890: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867672.21935: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867672.21962: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867672.21990: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867672.22011: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867672.22069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 30575 1726867672.22091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867672.22109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867672.22140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867672.22151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867672.22188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867672.22204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867672.22227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867672.22250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867672.22260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867672.22370: variable '__network_required_facts' from source: role '' defaults 30575 1726867672.22379: variable 'ansible_facts' from source: unknown 30575 1726867672.22814: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30575 1726867672.22818: when evaluation is False, skipping this task 30575 1726867672.22820: _execute() done 30575 1726867672.22826: dumping result to json 30575 1726867672.22828: done dumping result, returning 30575 1726867672.22836: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-e081-a588-00000000237f] 30575 1726867672.22841: sending task result for task 0affcac9-a3a5-e081-a588-00000000237f 30575 1726867672.22919: done sending task result for task 0affcac9-a3a5-e081-a588-00000000237f 30575 1726867672.22922: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867672.22963: no more pending results, returning what we have 30575 1726867672.22967: results queue empty 30575 1726867672.22968: checking for any_errors_fatal 30575 1726867672.22969: done checking for any_errors_fatal 30575 1726867672.22970: checking for max_fail_percentage 30575 1726867672.22971: done checking for max_fail_percentage 30575 1726867672.22972: checking to see if all hosts have failed and the running result is not ok 30575 1726867672.22973: done checking to see if all hosts have failed 30575 1726867672.22974: getting the remaining hosts for this loop 30575 1726867672.22975: done getting the remaining hosts for this loop 30575 1726867672.22981: getting the next task for host managed_node3 30575 1726867672.22992: done getting next task for host managed_node3 
30575 1726867672.22995: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867672.23001: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867672.23023: getting variables 30575 1726867672.23025: in VariableManager get_vars() 30575 1726867672.23064: Calling all_inventory to load vars for managed_node3 30575 1726867672.23067: Calling groups_inventory to load vars for managed_node3 30575 1726867672.23069: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867672.23082: Calling all_plugins_play to load vars for managed_node3 30575 1726867672.23085: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867672.23094: Calling groups_plugins_play to load vars for managed_node3 30575 1726867672.23835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867672.24691: done with get_vars() 30575 1726867672.24706: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:27:52 -0400 (0:00:00.054) 0:01:47.625 ****** 30575 1726867672.24769: entering _queue_task() for managed_node3/stat 30575 1726867672.24967: worker is 1 (out of 1 available) 30575 1726867672.24984: exiting _queue_task() for managed_node3/stat 30575 1726867672.24995: done queuing things up, now waiting for results queue to drain 30575 1726867672.24997: waiting for pending results... 
30575 1726867672.25170: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867672.25278: in run() - task 0affcac9-a3a5-e081-a588-000000002381 30575 1726867672.25291: variable 'ansible_search_path' from source: unknown 30575 1726867672.25295: variable 'ansible_search_path' from source: unknown 30575 1726867672.25323: calling self._execute() 30575 1726867672.25395: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867672.25399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867672.25407: variable 'omit' from source: magic vars 30575 1726867672.25669: variable 'ansible_distribution_major_version' from source: facts 30575 1726867672.25680: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867672.25792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867672.25980: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867672.26014: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867672.26039: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867672.26064: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867672.26130: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867672.26148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867672.26166: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867672.26186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867672.26258: variable '__network_is_ostree' from source: set_fact 30575 1726867672.26264: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867672.26266: when evaluation is False, skipping this task 30575 1726867672.26269: _execute() done 30575 1726867672.26273: dumping result to json 30575 1726867672.26276: done dumping result, returning 30575 1726867672.26286: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-e081-a588-000000002381] 30575 1726867672.26291: sending task result for task 0affcac9-a3a5-e081-a588-000000002381 30575 1726867672.26371: done sending task result for task 0affcac9-a3a5-e081-a588-000000002381 30575 1726867672.26374: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867672.26458: no more pending results, returning what we have 30575 1726867672.26462: results queue empty 30575 1726867672.26463: checking for any_errors_fatal 30575 1726867672.26468: done checking for any_errors_fatal 30575 1726867672.26469: checking for max_fail_percentage 30575 1726867672.26470: done checking for max_fail_percentage 30575 1726867672.26471: checking to see if all hosts have failed and the running result is not ok 30575 1726867672.26472: done checking to see if all hosts have failed 30575 1726867672.26472: getting the remaining hosts for this loop 30575 1726867672.26474: done getting the remaining hosts for this loop 30575 
1726867672.26478: getting the next task for host managed_node3 30575 1726867672.26486: done getting next task for host managed_node3 30575 1726867672.26489: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867672.26494: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867672.26511: getting variables 30575 1726867672.26512: in VariableManager get_vars() 30575 1726867672.26548: Calling all_inventory to load vars for managed_node3 30575 1726867672.26550: Calling groups_inventory to load vars for managed_node3 30575 1726867672.26551: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867672.26557: Calling all_plugins_play to load vars for managed_node3 30575 1726867672.26558: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867672.26560: Calling groups_plugins_play to load vars for managed_node3 30575 1726867672.27428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867672.28266: done with get_vars() 30575 1726867672.28282: done getting variables 30575 1726867672.28321: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:27:52 -0400 (0:00:00.035) 0:01:47.661 ****** 30575 1726867672.28345: entering _queue_task() for managed_node3/set_fact 30575 1726867672.28541: worker is 1 (out of 1 available) 30575 1726867672.28553: exiting _queue_task() for managed_node3/set_fact 30575 1726867672.28566: done queuing things up, now waiting for results queue to drain 30575 1726867672.28568: waiting for pending results... 
30575 1726867672.28728: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867672.28826: in run() - task 0affcac9-a3a5-e081-a588-000000002382 30575 1726867672.28837: variable 'ansible_search_path' from source: unknown 30575 1726867672.28841: variable 'ansible_search_path' from source: unknown 30575 1726867672.28867: calling self._execute() 30575 1726867672.28934: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867672.28940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867672.28948: variable 'omit' from source: magic vars 30575 1726867672.29199: variable 'ansible_distribution_major_version' from source: facts 30575 1726867672.29207: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867672.29318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867672.29503: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867672.29536: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867672.29562: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867672.29587: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867672.29645: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867672.29665: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867672.29690: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867672.29707: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867672.29774: variable '__network_is_ostree' from source: set_fact 30575 1726867672.29780: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867672.29783: when evaluation is False, skipping this task 30575 1726867672.29785: _execute() done 30575 1726867672.29788: dumping result to json 30575 1726867672.29790: done dumping result, returning 30575 1726867672.29798: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-e081-a588-000000002382] 30575 1726867672.29802: sending task result for task 0affcac9-a3a5-e081-a588-000000002382 30575 1726867672.29879: done sending task result for task 0affcac9-a3a5-e081-a588-000000002382 30575 1726867672.29882: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867672.29929: no more pending results, returning what we have 30575 1726867672.29932: results queue empty 30575 1726867672.29933: checking for any_errors_fatal 30575 1726867672.29938: done checking for any_errors_fatal 30575 1726867672.29939: checking for max_fail_percentage 30575 1726867672.29940: done checking for max_fail_percentage 30575 1726867672.29941: checking to see if all hosts have failed and the running result is not ok 30575 1726867672.29942: done checking to see if all hosts have failed 30575 1726867672.29943: getting the remaining hosts for this loop 30575 1726867672.29944: done getting the remaining hosts for this loop 
30575 1726867672.29947: getting the next task for host managed_node3 30575 1726867672.29957: done getting next task for host managed_node3 30575 1726867672.29960: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867672.29965: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867672.29985: getting variables 30575 1726867672.29987: in VariableManager get_vars() 30575 1726867672.30024: Calling all_inventory to load vars for managed_node3 30575 1726867672.30026: Calling groups_inventory to load vars for managed_node3 30575 1726867672.30028: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867672.30035: Calling all_plugins_play to load vars for managed_node3 30575 1726867672.30038: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867672.30040: Calling groups_plugins_play to load vars for managed_node3 30575 1726867672.30766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867672.31653: done with get_vars() 30575 1726867672.31668: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:27:52 -0400 (0:00:00.033) 0:01:47.694 ****** 30575 1726867672.31733: entering _queue_task() for managed_node3/service_facts 30575 1726867672.31927: worker is 1 (out of 1 available) 30575 1726867672.31941: exiting _queue_task() for managed_node3/service_facts 30575 1726867672.31954: done queuing things up, now waiting for results queue to drain 30575 1726867672.31956: waiting for pending results... 
30575 1726867672.32115: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867672.32203: in run() - task 0affcac9-a3a5-e081-a588-000000002384 30575 1726867672.32215: variable 'ansible_search_path' from source: unknown 30575 1726867672.32221: variable 'ansible_search_path' from source: unknown 30575 1726867672.32243: calling self._execute() 30575 1726867672.32318: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867672.32323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867672.32329: variable 'omit' from source: magic vars 30575 1726867672.32576: variable 'ansible_distribution_major_version' from source: facts 30575 1726867672.32586: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867672.32592: variable 'omit' from source: magic vars 30575 1726867672.32644: variable 'omit' from source: magic vars 30575 1726867672.32665: variable 'omit' from source: magic vars 30575 1726867672.32694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867672.32722: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867672.32734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867672.32747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867672.32757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867672.32780: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867672.32783: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867672.32788: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867672.32854: Set connection var ansible_pipelining to False 30575 1726867672.32857: Set connection var ansible_shell_type to sh 30575 1726867672.32862: Set connection var ansible_shell_executable to /bin/sh 30575 1726867672.32867: Set connection var ansible_timeout to 10 30575 1726867672.32872: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867672.32880: Set connection var ansible_connection to ssh 30575 1726867672.32897: variable 'ansible_shell_executable' from source: unknown 30575 1726867672.32900: variable 'ansible_connection' from source: unknown 30575 1726867672.32903: variable 'ansible_module_compression' from source: unknown 30575 1726867672.32905: variable 'ansible_shell_type' from source: unknown 30575 1726867672.32907: variable 'ansible_shell_executable' from source: unknown 30575 1726867672.32909: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867672.32913: variable 'ansible_pipelining' from source: unknown 30575 1726867672.32918: variable 'ansible_timeout' from source: unknown 30575 1726867672.32920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867672.33051: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867672.33056: variable 'omit' from source: magic vars 30575 1726867672.33064: starting attempt loop 30575 1726867672.33067: running the handler 30575 1726867672.33079: _low_level_execute_command(): starting 30575 1726867672.33086: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867672.33566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30575 1726867672.33570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867672.33573: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867672.33575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867672.33631: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867672.33634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867672.33690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867672.35371: stdout chunk (state=3): >>>/root <<< 30575 1726867672.35469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867672.35496: stderr chunk (state=3): >>><<< 30575 1726867672.35499: stdout chunk (state=3): >>><<< 30575 1726867672.35515: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867672.35528: _low_level_execute_command(): starting 30575 1726867672.35537: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867672.3551476-35538-69198079653772 `" && echo ansible-tmp-1726867672.3551476-35538-69198079653772="` echo /root/.ansible/tmp/ansible-tmp-1726867672.3551476-35538-69198079653772 `" ) && sleep 0' 30575 1726867672.35951: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867672.35955: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867672.35957: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867672.35967: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867672.35970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867672.36016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867672.36020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867672.36025: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867672.36069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867672.37958: stdout chunk (state=3): >>>ansible-tmp-1726867672.3551476-35538-69198079653772=/root/.ansible/tmp/ansible-tmp-1726867672.3551476-35538-69198079653772 <<< 30575 1726867672.38063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867672.38086: stderr chunk (state=3): >>><<< 30575 1726867672.38089: stdout chunk (state=3): >>><<< 30575 1726867672.38099: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867672.3551476-35538-69198079653772=/root/.ansible/tmp/ansible-tmp-1726867672.3551476-35538-69198079653772 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867672.38137: variable 'ansible_module_compression' from source: unknown 30575 1726867672.38168: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30575 1726867672.38200: variable 'ansible_facts' from source: unknown 30575 1726867672.38259: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867672.3551476-35538-69198079653772/AnsiballZ_service_facts.py 30575 1726867672.38351: Sending initial data 30575 1726867672.38355: Sent initial data (161 bytes) 30575 1726867672.38794: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867672.38797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867672.38800: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 
1726867672.38802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867672.38803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867672.38847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867672.38850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867672.38900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867672.40435: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867672.40473: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867672.40524: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpyskw6a9r /root/.ansible/tmp/ansible-tmp-1726867672.3551476-35538-69198079653772/AnsiballZ_service_facts.py <<< 30575 1726867672.40526: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867672.3551476-35538-69198079653772/AnsiballZ_service_facts.py" <<< 30575 1726867672.40566: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpyskw6a9r" to remote "/root/.ansible/tmp/ansible-tmp-1726867672.3551476-35538-69198079653772/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867672.3551476-35538-69198079653772/AnsiballZ_service_facts.py" <<< 30575 1726867672.41135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867672.41167: stderr chunk (state=3): >>><<< 30575 1726867672.41170: stdout chunk (state=3): >>><<< 30575 1726867672.41230: done transferring module to remote 30575 1726867672.41237: _low_level_execute_command(): starting 30575 1726867672.41240: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867672.3551476-35538-69198079653772/ /root/.ansible/tmp/ansible-tmp-1726867672.3551476-35538-69198079653772/AnsiballZ_service_facts.py && sleep 0' 30575 1726867672.41799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30575 1726867672.41815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867672.41835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867672.41852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867672.41870: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 <<< 30575 1726867672.41884: stderr chunk (state=3): >>>debug2: match not found <<< 30575 1726867672.41899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867672.41925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867672.41994: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867672.42030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867672.42046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867672.42060: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867672.42169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867672.43905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867672.43928: stderr chunk (state=3): >>><<< 30575 1726867672.43932: stdout chunk (state=3): >>><<< 30575 1726867672.43945: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867672.43948: _low_level_execute_command(): starting 30575 1726867672.43952: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867672.3551476-35538-69198079653772/AnsiballZ_service_facts.py && sleep 0' 30575 1726867672.44340: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867672.44346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867672.44362: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867672.44411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867672.44414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867672.44472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867673.94610: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": 
{"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30575 1726867673.94640: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 30575 1726867673.94645: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 30575 1726867673.94663: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", 
"status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-<<< 30575 1726867673.94683: stdout chunk (state=3): >>>boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30575 1726867673.96172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867673.96204: stderr chunk (state=3): >>><<< 30575 1726867673.96207: stdout chunk (state=3): >>><<< 30575 1726867673.96238: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, 
"gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": 
"modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": 
"sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": 
"debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": 
{"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": 
{"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867673.96956: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867672.3551476-35538-69198079653772/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867673.96964: _low_level_execute_command(): starting 30575 1726867673.96969: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867672.3551476-35538-69198079653772/ > /dev/null 2>&1 && sleep 0' 30575 1726867673.97429: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867673.97432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867673.97434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867673.97437: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867673.97438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
30575 1726867673.97440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867673.97483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867673.97502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867673.97547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867673.99346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867673.99371: stderr chunk (state=3): >>><<< 30575 1726867673.99374: stdout chunk (state=3): >>><<< 30575 1726867673.99389: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867673.99395: handler run complete 30575 1726867673.99505: variable 'ansible_facts' from source: unknown 30575 1726867673.99602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867673.99882: variable 'ansible_facts' from source: unknown 30575 1726867673.99967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867674.00081: attempt loop complete, returning result 30575 1726867674.00087: _execute() done 30575 1726867674.00089: dumping result to json 30575 1726867674.00127: done dumping result, returning 30575 1726867674.00135: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-e081-a588-000000002384] 30575 1726867674.00139: sending task result for task 0affcac9-a3a5-e081-a588-000000002384 30575 1726867674.00925: done sending task result for task 0affcac9-a3a5-e081-a588-000000002384 30575 1726867674.00928: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867674.00982: no more pending results, returning what we have 30575 1726867674.00984: results queue empty 30575 1726867674.00984: checking for any_errors_fatal 30575 1726867674.00986: done checking for any_errors_fatal 30575 1726867674.00987: checking for max_fail_percentage 30575 1726867674.00988: done checking for max_fail_percentage 30575 1726867674.00989: checking to see if all hosts have failed and the running result is not ok 30575 1726867674.00989: done checking to see if all hosts have failed 30575 1726867674.00990: getting the remaining hosts for this loop 30575 1726867674.00990: done getting the remaining hosts for this loop 30575 1726867674.00993: getting 
the next task for host managed_node3 30575 1726867674.00997: done getting next task for host managed_node3 30575 1726867674.00999: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867674.01004: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867674.01014: getting variables 30575 1726867674.01015: in VariableManager get_vars() 30575 1726867674.01040: Calling all_inventory to load vars for managed_node3 30575 1726867674.01042: Calling groups_inventory to load vars for managed_node3 30575 1726867674.01044: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867674.01050: Calling all_plugins_play to load vars for managed_node3 30575 1726867674.01052: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867674.01057: Calling groups_plugins_play to load vars for managed_node3 30575 1726867674.01740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867674.02603: done with get_vars() 30575 1726867674.02620: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:27:54 -0400 (0:00:01.709) 0:01:49.404 ****** 30575 1726867674.02689: entering _queue_task() for managed_node3/package_facts 30575 1726867674.02921: worker is 1 (out of 1 available) 30575 1726867674.02935: exiting _queue_task() for managed_node3/package_facts 30575 1726867674.02949: done queuing things up, now waiting for results queue to drain 30575 1726867674.02951: waiting for pending results... 
30575 1726867674.03136: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867674.03237: in run() - task 0affcac9-a3a5-e081-a588-000000002385 30575 1726867674.03249: variable 'ansible_search_path' from source: unknown 30575 1726867674.03253: variable 'ansible_search_path' from source: unknown 30575 1726867674.03286: calling self._execute() 30575 1726867674.03359: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867674.03363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867674.03372: variable 'omit' from source: magic vars 30575 1726867674.03648: variable 'ansible_distribution_major_version' from source: facts 30575 1726867674.03657: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867674.03665: variable 'omit' from source: magic vars 30575 1726867674.03722: variable 'omit' from source: magic vars 30575 1726867674.03743: variable 'omit' from source: magic vars 30575 1726867674.03774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867674.03801: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867674.03820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867674.03832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867674.03843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867674.03866: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867674.03869: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867674.03871: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867674.03942: Set connection var ansible_pipelining to False 30575 1726867674.03945: Set connection var ansible_shell_type to sh 30575 1726867674.03951: Set connection var ansible_shell_executable to /bin/sh 30575 1726867674.03955: Set connection var ansible_timeout to 10 30575 1726867674.03961: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867674.03967: Set connection var ansible_connection to ssh 30575 1726867674.03986: variable 'ansible_shell_executable' from source: unknown 30575 1726867674.03989: variable 'ansible_connection' from source: unknown 30575 1726867674.03992: variable 'ansible_module_compression' from source: unknown 30575 1726867674.03994: variable 'ansible_shell_type' from source: unknown 30575 1726867674.03996: variable 'ansible_shell_executable' from source: unknown 30575 1726867674.03998: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867674.04000: variable 'ansible_pipelining' from source: unknown 30575 1726867674.04003: variable 'ansible_timeout' from source: unknown 30575 1726867674.04007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867674.04149: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867674.04160: variable 'omit' from source: magic vars 30575 1726867674.04163: starting attempt loop 30575 1726867674.04165: running the handler 30575 1726867674.04178: _low_level_execute_command(): starting 30575 1726867674.04185: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867674.04691: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30575 1726867674.04695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867674.04700: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867674.04702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867674.04754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867674.04757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867674.04764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867674.04810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867674.06414: stdout chunk (state=3): >>>/root <<< 30575 1726867674.06514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867674.06542: stderr chunk (state=3): >>><<< 30575 1726867674.06546: stdout chunk (state=3): >>><<< 30575 1726867674.06563: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867674.06573: _low_level_execute_command(): starting 30575 1726867674.06579: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867674.0656192-35560-231191077087660 `" && echo ansible-tmp-1726867674.0656192-35560-231191077087660="` echo /root/.ansible/tmp/ansible-tmp-1726867674.0656192-35560-231191077087660 `" ) && sleep 0' 30575 1726867674.06969: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867674.06975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867674.07001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867674.07041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867674.07045: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867674.07095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867674.08963: stdout chunk (state=3): >>>ansible-tmp-1726867674.0656192-35560-231191077087660=/root/.ansible/tmp/ansible-tmp-1726867674.0656192-35560-231191077087660 <<< 30575 1726867674.09071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867674.09093: stderr chunk (state=3): >>><<< 30575 1726867674.09097: stdout chunk (state=3): >>><<< 30575 1726867674.09108: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867674.0656192-35560-231191077087660=/root/.ansible/tmp/ansible-tmp-1726867674.0656192-35560-231191077087660 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867674.09145: variable 'ansible_module_compression' from source: unknown 30575 1726867674.09179: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30575 1726867674.09229: variable 'ansible_facts' from source: unknown 30575 1726867674.09346: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867674.0656192-35560-231191077087660/AnsiballZ_package_facts.py 30575 1726867674.09441: Sending initial data 30575 1726867674.09445: Sent initial data (162 bytes) 30575 1726867674.09857: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867674.09860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867674.09863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867674.09865: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 
1726867674.09867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867674.09923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867674.09925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867674.09964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867674.11483: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867674.11489: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867674.11526: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867674.11566: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpud9e9rtg /root/.ansible/tmp/ansible-tmp-1726867674.0656192-35560-231191077087660/AnsiballZ_package_facts.py <<< 30575 1726867674.11573: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867674.0656192-35560-231191077087660/AnsiballZ_package_facts.py" <<< 30575 1726867674.11611: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpud9e9rtg" to remote "/root/.ansible/tmp/ansible-tmp-1726867674.0656192-35560-231191077087660/AnsiballZ_package_facts.py" <<< 30575 1726867674.11613: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867674.0656192-35560-231191077087660/AnsiballZ_package_facts.py" <<< 30575 1726867674.12629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867674.12659: stderr chunk (state=3): >>><<< 30575 1726867674.12662: stdout chunk (state=3): >>><<< 30575 1726867674.12699: done transferring module to remote 30575 1726867674.12707: _low_level_execute_command(): starting 30575 1726867674.12711: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867674.0656192-35560-231191077087660/ /root/.ansible/tmp/ansible-tmp-1726867674.0656192-35560-231191077087660/AnsiballZ_package_facts.py && sleep 0' 30575 1726867674.13111: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867674.13114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867674.13119: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867674.13121: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867674.13126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867674.13171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867674.13176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867674.13218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867674.14949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867674.14969: stderr chunk (state=3): >>><<< 30575 1726867674.14972: stdout chunk (state=3): >>><<< 30575 1726867674.14984: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867674.14988: _low_level_execute_command(): starting 30575 1726867674.14990: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867674.0656192-35560-231191077087660/AnsiballZ_package_facts.py && sleep 0' 30575 1726867674.15378: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867674.15382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867674.15384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867674.15386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867674.15388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867674.15429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867674.15444: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867674.15495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867674.59443: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": 
[{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": 
"4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30575 1726867674.59465: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": 
[{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 30575 1726867674.59504: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": 
"gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version":
"1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name":
"libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": 
[{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch":
"x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": 
"2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release":
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}],
"perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget",
"version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30575 1726867674.59628: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": 
[{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 30575 1726867674.59643: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30575 1726867674.61387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867674.61420: stderr chunk (state=3): >>><<< 30575 1726867674.61423: stdout chunk (state=3): >>><<< 30575 1726867674.61466: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867674.62809: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867674.0656192-35560-231191077087660/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867674.62827: _low_level_execute_command(): starting 30575 1726867674.62831: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867674.0656192-35560-231191077087660/ > /dev/null 2>&1 && sleep 0' 30575 1726867674.63270: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867674.63274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867674.63290: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867674.63344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867674.63347: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867674.63349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867674.63400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867674.65233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867674.65257: stderr chunk (state=3): >>><<< 30575 1726867674.65260: stdout chunk (state=3): >>><<< 30575 1726867674.65272: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867674.65279: handler run 
complete 30575 1726867674.65741: variable 'ansible_facts' from source: unknown 30575 1726867674.66040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867674.67075: variable 'ansible_facts' from source: unknown 30575 1726867674.67321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867674.67695: attempt loop complete, returning result 30575 1726867674.67704: _execute() done 30575 1726867674.67707: dumping result to json 30575 1726867674.67823: done dumping result, returning 30575 1726867674.67831: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-e081-a588-000000002385] 30575 1726867674.67835: sending task result for task 0affcac9-a3a5-e081-a588-000000002385 30575 1726867674.69253: done sending task result for task 0affcac9-a3a5-e081-a588-000000002385 30575 1726867674.69256: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867674.69348: no more pending results, returning what we have 30575 1726867674.69350: results queue empty 30575 1726867674.69351: checking for any_errors_fatal 30575 1726867674.69354: done checking for any_errors_fatal 30575 1726867674.69354: checking for max_fail_percentage 30575 1726867674.69355: done checking for max_fail_percentage 30575 1726867674.69356: checking to see if all hosts have failed and the running result is not ok 30575 1726867674.69356: done checking to see if all hosts have failed 30575 1726867674.69357: getting the remaining hosts for this loop 30575 1726867674.69358: done getting the remaining hosts for this loop 30575 1726867674.69360: getting the next task for host managed_node3 30575 1726867674.69365: done getting next task for host managed_node3 30575 
1726867674.69367: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867674.69371: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867674.69380: getting variables 30575 1726867674.69381: in VariableManager get_vars() 30575 1726867674.69405: Calling all_inventory to load vars for managed_node3 30575 1726867674.69407: Calling groups_inventory to load vars for managed_node3 30575 1726867674.69409: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867674.69415: Calling all_plugins_play to load vars for managed_node3 30575 1726867674.69417: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867674.69419: Calling groups_plugins_play to load vars for managed_node3 30575 1726867674.70093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867674.70945: done with get_vars() 30575 1726867674.70963: done getting variables 30575 1726867674.71006: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:27:54 -0400 (0:00:00.683) 0:01:50.087 ****** 30575 1726867674.71034: entering _queue_task() for managed_node3/debug 30575 1726867674.71265: worker is 1 (out of 1 available) 30575 1726867674.71280: exiting _queue_task() for managed_node3/debug 30575 1726867674.71293: done queuing things up, now waiting for results queue to drain 30575 1726867674.71295: waiting for pending results... 
30575 1726867674.71485: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867674.71562: in run() - task 0affcac9-a3a5-e081-a588-000000002329 30575 1726867674.71574: variable 'ansible_search_path' from source: unknown 30575 1726867674.71580: variable 'ansible_search_path' from source: unknown 30575 1726867674.71607: calling self._execute() 30575 1726867674.71686: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867674.71690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867674.71699: variable 'omit' from source: magic vars 30575 1726867674.71980: variable 'ansible_distribution_major_version' from source: facts 30575 1726867674.71989: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867674.71996: variable 'omit' from source: magic vars 30575 1726867674.72042: variable 'omit' from source: magic vars 30575 1726867674.72110: variable 'network_provider' from source: set_fact 30575 1726867674.72128: variable 'omit' from source: magic vars 30575 1726867674.72159: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867674.72189: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867674.72204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867674.72217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867674.72230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867674.72252: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867674.72255: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867674.72260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867674.72329: Set connection var ansible_pipelining to False 30575 1726867674.72333: Set connection var ansible_shell_type to sh 30575 1726867674.72336: Set connection var ansible_shell_executable to /bin/sh 30575 1726867674.72342: Set connection var ansible_timeout to 10 30575 1726867674.72347: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867674.72353: Set connection var ansible_connection to ssh 30575 1726867674.72370: variable 'ansible_shell_executable' from source: unknown 30575 1726867674.72373: variable 'ansible_connection' from source: unknown 30575 1726867674.72376: variable 'ansible_module_compression' from source: unknown 30575 1726867674.72380: variable 'ansible_shell_type' from source: unknown 30575 1726867674.72383: variable 'ansible_shell_executable' from source: unknown 30575 1726867674.72388: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867674.72390: variable 'ansible_pipelining' from source: unknown 30575 1726867674.72393: variable 'ansible_timeout' from source: unknown 30575 1726867674.72399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867674.72499: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867674.72511: variable 'omit' from source: magic vars 30575 1726867674.72514: starting attempt loop 30575 1726867674.72517: running the handler 30575 1726867674.72555: handler run complete 30575 1726867674.72565: attempt loop complete, returning result 30575 1726867674.72568: _execute() done 30575 1726867674.72571: dumping result to json 30575 1726867674.72573: done dumping result, returning 
30575 1726867674.72586: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-e081-a588-000000002329] 30575 1726867674.72589: sending task result for task 0affcac9-a3a5-e081-a588-000000002329 ok: [managed_node3] => {} MSG: Using network provider: nm 30575 1726867674.72734: no more pending results, returning what we have 30575 1726867674.72737: results queue empty 30575 1726867674.72738: checking for any_errors_fatal 30575 1726867674.72746: done checking for any_errors_fatal 30575 1726867674.72746: checking for max_fail_percentage 30575 1726867674.72748: done checking for max_fail_percentage 30575 1726867674.72749: checking to see if all hosts have failed and the running result is not ok 30575 1726867674.72749: done checking to see if all hosts have failed 30575 1726867674.72750: getting the remaining hosts for this loop 30575 1726867674.72752: done getting the remaining hosts for this loop 30575 1726867674.72755: getting the next task for host managed_node3 30575 1726867674.72762: done getting next task for host managed_node3 30575 1726867674.72766: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867674.72771: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867674.72784: getting variables 30575 1726867674.72786: in VariableManager get_vars() 30575 1726867674.72823: Calling all_inventory to load vars for managed_node3 30575 1726867674.72826: Calling groups_inventory to load vars for managed_node3 30575 1726867674.72828: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867674.72836: Calling all_plugins_play to load vars for managed_node3 30575 1726867674.72839: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867674.72841: Calling groups_plugins_play to load vars for managed_node3 30575 1726867674.73679: done sending task result for task 0affcac9-a3a5-e081-a588-000000002329 30575 1726867674.73682: WORKER PROCESS EXITING 30575 1726867674.73693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867674.74557: done with get_vars() 30575 1726867674.74572: done getting variables 30575 1726867674.74613: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:27:54 -0400 (0:00:00.036) 0:01:50.123 ****** 30575 1726867674.74643: entering _queue_task() for managed_node3/fail 30575 1726867674.74843: worker is 1 (out of 1 available) 30575 1726867674.74856: exiting _queue_task() for managed_node3/fail 30575 1726867674.74870: done queuing things up, now waiting for results queue to drain 30575 1726867674.74872: waiting for pending results... 30575 1726867674.75056: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867674.75142: in run() - task 0affcac9-a3a5-e081-a588-00000000232a 30575 1726867674.75153: variable 'ansible_search_path' from source: unknown 30575 1726867674.75157: variable 'ansible_search_path' from source: unknown 30575 1726867674.75186: calling self._execute() 30575 1726867674.75258: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867674.75262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867674.75271: variable 'omit' from source: magic vars 30575 1726867674.75537: variable 'ansible_distribution_major_version' from source: facts 30575 1726867674.75542: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867674.75623: variable 'network_state' from source: role '' defaults 30575 1726867674.75633: Evaluated conditional (network_state != {}): False 30575 1726867674.75636: when evaluation is False, skipping this task 30575 1726867674.75640: _execute() done 30575 1726867674.75643: dumping result to json 30575 1726867674.75646: done dumping result, returning 30575 1726867674.75654: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-e081-a588-00000000232a] 30575 1726867674.75657: sending task result for task 0affcac9-a3a5-e081-a588-00000000232a 30575 1726867674.75743: done sending task result for task 0affcac9-a3a5-e081-a588-00000000232a 30575 1726867674.75747: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867674.75803: no more pending results, returning what we have 30575 1726867674.75806: results queue empty 30575 1726867674.75807: checking for any_errors_fatal 30575 1726867674.75812: done checking for any_errors_fatal 30575 1726867674.75812: checking for max_fail_percentage 30575 1726867674.75814: done checking for max_fail_percentage 30575 1726867674.75815: checking to see if all hosts have failed and the running result is not ok 30575 1726867674.75815: done checking to see if all hosts have failed 30575 1726867674.75819: getting the remaining hosts for this loop 30575 1726867674.75820: done getting the remaining hosts for this loop 30575 1726867674.75823: getting the next task for host managed_node3 30575 1726867674.75830: done getting next task for host managed_node3 30575 1726867674.75833: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867674.75837: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867674.75857: getting variables 30575 1726867674.75859: in VariableManager get_vars() 30575 1726867674.75900: Calling all_inventory to load vars for managed_node3 30575 1726867674.75902: Calling groups_inventory to load vars for managed_node3 30575 1726867674.75904: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867674.75912: Calling all_plugins_play to load vars for managed_node3 30575 1726867674.75914: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867674.75920: Calling groups_plugins_play to load vars for managed_node3 30575 1726867674.76643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867674.77500: done with get_vars() 30575 1726867674.77515: done getting variables 30575 1726867674.77555: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:27:54 -0400 (0:00:00.029) 0:01:50.153 ****** 30575 1726867674.77578: entering _queue_task() for managed_node3/fail 30575 1726867674.77772: worker is 1 (out of 1 available) 30575 1726867674.77788: exiting _queue_task() for managed_node3/fail 30575 1726867674.77801: done queuing things up, now waiting for results queue to drain 30575 1726867674.77803: waiting for pending results... 30575 1726867674.77970: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867674.78063: in run() - task 0affcac9-a3a5-e081-a588-00000000232b 30575 1726867674.78074: variable 'ansible_search_path' from source: unknown 30575 1726867674.78078: variable 'ansible_search_path' from source: unknown 30575 1726867674.78104: calling self._execute() 30575 1726867674.78171: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867674.78175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867674.78185: variable 'omit' from source: magic vars 30575 1726867674.78429: variable 'ansible_distribution_major_version' from source: facts 30575 1726867674.78437: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867674.78522: variable 'network_state' from source: role '' defaults 30575 1726867674.78529: Evaluated conditional (network_state != {}): False 30575 1726867674.78532: when evaluation is False, skipping this task 30575 1726867674.78535: _execute() done 30575 1726867674.78538: dumping result to json 30575 1726867674.78540: done dumping result, returning 30575 1726867674.78548: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-e081-a588-00000000232b] 30575 1726867674.78553: sending task result for task 0affcac9-a3a5-e081-a588-00000000232b 30575 1726867674.78639: done sending task result for task 0affcac9-a3a5-e081-a588-00000000232b 30575 1726867674.78643: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867674.78719: no more pending results, returning what we have 30575 1726867674.78722: results queue empty 30575 1726867674.78722: checking for any_errors_fatal 30575 1726867674.78729: done checking for any_errors_fatal 30575 1726867674.78729: checking for max_fail_percentage 30575 1726867674.78731: done checking for max_fail_percentage 30575 1726867674.78732: checking to see if all hosts have failed and the running result is not ok 30575 1726867674.78732: done checking to see if all hosts have failed 30575 1726867674.78733: getting the remaining hosts for this loop 30575 1726867674.78734: done getting the remaining hosts for this loop 30575 1726867674.78737: getting the next task for host managed_node3 30575 1726867674.78743: done getting next task for host managed_node3 30575 1726867674.78747: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867674.78751: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867674.78768: getting variables 30575 1726867674.78769: in VariableManager get_vars() 30575 1726867674.78799: Calling all_inventory to load vars for managed_node3 30575 1726867674.78800: Calling groups_inventory to load vars for managed_node3 30575 1726867674.78802: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867674.78807: Calling all_plugins_play to load vars for managed_node3 30575 1726867674.78809: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867674.78811: Calling groups_plugins_play to load vars for managed_node3 30575 1726867674.83438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867674.84280: done with get_vars() 30575 1726867674.84296: done getting variables 30575 1726867674.84333: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:27:54 -0400 (0:00:00.067) 0:01:50.221 ****** 30575 1726867674.84353: entering _queue_task() for managed_node3/fail 30575 1726867674.84628: worker is 1 (out of 1 available) 30575 1726867674.84641: exiting _queue_task() for managed_node3/fail 30575 1726867674.84654: done queuing things up, now waiting for results queue to drain 30575 1726867674.84656: waiting for pending results... 30575 1726867674.84845: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867674.84947: in run() - task 0affcac9-a3a5-e081-a588-00000000232c 30575 1726867674.84958: variable 'ansible_search_path' from source: unknown 30575 1726867674.84964: variable 'ansible_search_path' from source: unknown 30575 1726867674.84994: calling self._execute() 30575 1726867674.85072: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867674.85079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867674.85091: variable 'omit' from source: magic vars 30575 1726867674.85373: variable 'ansible_distribution_major_version' from source: facts 30575 1726867674.85384: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867674.85504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867674.87036: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867674.87093: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867674.87121: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867674.87145: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867674.87166: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867674.87231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867674.87252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867674.87276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867674.87302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867674.87314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867674.87396: variable 'ansible_distribution_major_version' from source: facts 30575 1726867674.87399: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30575 1726867674.87474: variable 'ansible_distribution' from source: facts 30575 1726867674.87479: variable '__network_rh_distros' from source: role '' defaults 30575 1726867674.87487: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30575 1726867674.87644: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867674.87661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867674.87679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867674.87704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867674.87720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867674.87753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867674.87769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867674.87787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867674.87810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 
1726867674.87826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867674.87854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867674.87870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867674.87888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867674.87911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867674.87924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867674.88118: variable 'network_connections' from source: include params 30575 1726867674.88129: variable 'interface' from source: play vars 30575 1726867674.88175: variable 'interface' from source: play vars 30575 1726867674.88185: variable 'network_state' from source: role '' defaults 30575 1726867674.88232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867674.88348: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867674.88378: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867674.88400: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867674.88424: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867674.88454: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867674.88472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867674.88495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867674.88514: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867674.88534: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30575 1726867674.88538: when evaluation is False, skipping this task 30575 1726867674.88540: _execute() done 30575 1726867674.88542: dumping result to json 30575 1726867674.88545: done dumping result, returning 30575 1726867674.88553: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-e081-a588-00000000232c] 30575 1726867674.88558: sending task result for task 
0affcac9-a3a5-e081-a588-00000000232c 30575 1726867674.88639: done sending task result for task 0affcac9-a3a5-e081-a588-00000000232c 30575 1726867674.88641: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30575 1726867674.88686: no more pending results, returning what we have 30575 1726867674.88689: results queue empty 30575 1726867674.88690: checking for any_errors_fatal 30575 1726867674.88699: done checking for any_errors_fatal 30575 1726867674.88700: checking for max_fail_percentage 30575 1726867674.88701: done checking for max_fail_percentage 30575 1726867674.88702: checking to see if all hosts have failed and the running result is not ok 30575 1726867674.88703: done checking to see if all hosts have failed 30575 1726867674.88704: getting the remaining hosts for this loop 30575 1726867674.88705: done getting the remaining hosts for this loop 30575 1726867674.88709: getting the next task for host managed_node3 30575 1726867674.88718: done getting next task for host managed_node3 30575 1726867674.88722: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867674.88726: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867674.88753: getting variables 30575 1726867674.88755: in VariableManager get_vars() 30575 1726867674.88801: Calling all_inventory to load vars for managed_node3 30575 1726867674.88804: Calling groups_inventory to load vars for managed_node3 30575 1726867674.88806: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867674.88815: Calling all_plugins_play to load vars for managed_node3 30575 1726867674.88817: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867674.88820: Calling groups_plugins_play to load vars for managed_node3 30575 1726867674.89633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867674.90618: done with get_vars() 30575 1726867674.90634: done getting variables 30575 1726867674.90674: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:27:54 -0400 (0:00:00.063) 0:01:50.284 ****** 30575 1726867674.90700: entering _queue_task() for managed_node3/dnf 30575 1726867674.90938: worker is 1 (out of 1 available) 30575 1726867674.90954: exiting _queue_task() for managed_node3/dnf 30575 1726867674.90967: done queuing things up, now waiting for results queue to drain 30575 1726867674.90969: waiting for pending results... 30575 1726867674.91157: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867674.91268: in run() - task 0affcac9-a3a5-e081-a588-00000000232d 30575 1726867674.91282: variable 'ansible_search_path' from source: unknown 30575 1726867674.91286: variable 'ansible_search_path' from source: unknown 30575 1726867674.91316: calling self._execute() 30575 1726867674.91396: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867674.91401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867674.91413: variable 'omit' from source: magic vars 30575 1726867674.91683: variable 'ansible_distribution_major_version' from source: facts 30575 1726867674.91693: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867674.91829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867674.93361: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867674.93416: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867674.93444: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867674.93469: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867674.93494: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867674.93552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867674.93574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867674.93595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867674.93624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867674.93636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867674.93719: variable 'ansible_distribution' from source: facts 30575 1726867674.93725: variable 'ansible_distribution_major_version' from source: facts 30575 1726867674.93738: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30575 1726867674.93813: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867674.93896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867674.93915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867674.93934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867674.93958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867674.93969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867674.93999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867674.94019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867674.94037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867674.94062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867674.94072: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867674.94100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867674.94116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867674.94137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867674.94161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867674.94171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867674.94280: variable 'network_connections' from source: include params 30575 1726867674.94289: variable 'interface' from source: play vars 30575 1726867674.94334: variable 'interface' from source: play vars 30575 1726867674.94384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867674.94502: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867674.94532: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867674.94556: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867674.94579: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867674.94609: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867674.94627: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867674.94648: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867674.94666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867674.94701: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867674.94855: variable 'network_connections' from source: include params 30575 1726867674.94859: variable 'interface' from source: play vars 30575 1726867674.94904: variable 'interface' from source: play vars 30575 1726867674.94925: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867674.94929: when evaluation is False, skipping this task 30575 1726867674.94931: _execute() done 30575 1726867674.94934: dumping result to json 30575 1726867674.94936: done dumping result, returning 30575 1726867674.94944: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-00000000232d] 30575 
1726867674.94949: sending task result for task 0affcac9-a3a5-e081-a588-00000000232d 30575 1726867674.95031: done sending task result for task 0affcac9-a3a5-e081-a588-00000000232d 30575 1726867674.95034: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867674.95084: no more pending results, returning what we have 30575 1726867674.95088: results queue empty 30575 1726867674.95088: checking for any_errors_fatal 30575 1726867674.95095: done checking for any_errors_fatal 30575 1726867674.95096: checking for max_fail_percentage 30575 1726867674.95097: done checking for max_fail_percentage 30575 1726867674.95098: checking to see if all hosts have failed and the running result is not ok 30575 1726867674.95099: done checking to see if all hosts have failed 30575 1726867674.95100: getting the remaining hosts for this loop 30575 1726867674.95101: done getting the remaining hosts for this loop 30575 1726867674.95105: getting the next task for host managed_node3 30575 1726867674.95114: done getting next task for host managed_node3 30575 1726867674.95118: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867674.95123: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867674.95149: getting variables 30575 1726867674.95151: in VariableManager get_vars() 30575 1726867674.95197: Calling all_inventory to load vars for managed_node3 30575 1726867674.95199: Calling groups_inventory to load vars for managed_node3 30575 1726867674.95202: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867674.95210: Calling all_plugins_play to load vars for managed_node3 30575 1726867674.95212: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867674.95215: Calling groups_plugins_play to load vars for managed_node3 30575 1726867674.96019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867674.96893: done with get_vars() 30575 1726867674.96910: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867674.96960: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:27:54 -0400 (0:00:00.062) 0:01:50.347 ****** 30575 1726867674.96984: entering _queue_task() for managed_node3/yum 30575 1726867674.97208: worker is 1 (out of 1 available) 30575 1726867674.97222: exiting _queue_task() for managed_node3/yum 30575 1726867674.97234: done queuing things up, now waiting for results queue to drain 30575 1726867674.97236: waiting for pending results... 30575 1726867674.97418: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867674.97516: in run() - task 0affcac9-a3a5-e081-a588-00000000232e 30575 1726867674.97529: variable 'ansible_search_path' from source: unknown 30575 1726867674.97532: variable 'ansible_search_path' from source: unknown 30575 1726867674.97561: calling self._execute() 30575 1726867674.97642: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867674.97645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867674.97654: variable 'omit' from source: magic vars 30575 1726867674.97931: variable 'ansible_distribution_major_version' from source: facts 30575 1726867674.97940: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867674.98055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867674.99581: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867674.99638: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867674.99666: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867674.99693: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867674.99713: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867674.99774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867674.99795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867674.99812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867674.99841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867674.99855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867674.99923: variable 'ansible_distribution_major_version' from source: facts 30575 1726867674.99935: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30575 1726867674.99938: when evaluation is False, skipping this task 30575 1726867674.99941: _execute() done 30575 1726867674.99943: dumping result to json 30575 1726867674.99948: done dumping result, returning 30575 1726867674.99961: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-00000000232e] 30575 1726867674.99963: sending task result for task 0affcac9-a3a5-e081-a588-00000000232e 30575 1726867675.00045: done sending task result for task 0affcac9-a3a5-e081-a588-00000000232e 30575 1726867675.00047: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30575 1726867675.00112: no more pending results, returning what we have 30575 1726867675.00116: results queue empty 30575 1726867675.00117: checking for any_errors_fatal 30575 1726867675.00124: done checking for any_errors_fatal 30575 1726867675.00124: checking for max_fail_percentage 30575 1726867675.00127: done checking for max_fail_percentage 30575 1726867675.00127: checking to see if all hosts have failed and the running result is not ok 30575 1726867675.00128: done checking to see if all hosts have failed 30575 1726867675.00129: getting the remaining hosts for this loop 30575 1726867675.00130: done getting the remaining hosts for this loop 30575 1726867675.00134: getting the next task for host managed_node3 30575 1726867675.00142: done getting next task for host managed_node3 30575 1726867675.00146: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867675.00150: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867675.00175: getting variables 30575 1726867675.00178: in VariableManager get_vars() 30575 1726867675.00218: Calling all_inventory to load vars for managed_node3 30575 1726867675.00220: Calling groups_inventory to load vars for managed_node3 30575 1726867675.00222: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867675.00231: Calling all_plugins_play to load vars for managed_node3 30575 1726867675.00233: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867675.00235: Calling groups_plugins_play to load vars for managed_node3 30575 1726867675.01199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867675.02044: done with get_vars() 30575 1726867675.02059: done getting variables 30575 1726867675.02101: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:27:55 -0400 (0:00:00.051) 0:01:50.398 ****** 30575 1726867675.02126: entering _queue_task() for managed_node3/fail 30575 1726867675.02346: worker is 1 (out of 1 available) 30575 1726867675.02360: exiting _queue_task() for managed_node3/fail 30575 1726867675.02373: done queuing things up, now waiting for results queue to drain 30575 1726867675.02375: waiting for pending results... 30575 1726867675.02563: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867675.02671: in run() - task 0affcac9-a3a5-e081-a588-00000000232f 30575 1726867675.02690: variable 'ansible_search_path' from source: unknown 30575 1726867675.02695: variable 'ansible_search_path' from source: unknown 30575 1726867675.02725: calling self._execute() 30575 1726867675.02805: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867675.02808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867675.02824: variable 'omit' from source: magic vars 30575 1726867675.03092: variable 'ansible_distribution_major_version' from source: facts 30575 1726867675.03101: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867675.03188: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867675.03322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867675.04793: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867675.04842: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867675.04868: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867675.04896: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867675.04920: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867675.04976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.05002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.05020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.05045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.05056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.05091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.05109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.05127: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.05151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.05161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.05190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.05206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.05226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.05250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.05260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.05371: variable 'network_connections' from source: include params 30575 1726867675.05383: variable 'interface' from source: play vars 30575 1726867675.05429: variable 'interface' from source: play vars 30575 1726867675.05476: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867675.05585: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867675.05623: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867675.05643: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867675.05667: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867675.05698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867675.05714: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867675.05733: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.05750: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867675.05791: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867675.05935: variable 'network_connections' from source: include params 30575 1726867675.05939: variable 'interface' from source: play vars 30575 1726867675.05985: variable 'interface' from source: play vars 30575 1726867675.06000: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867675.06003: when evaluation is False, skipping this task 30575 
1726867675.06006: _execute() done 30575 1726867675.06009: dumping result to json 30575 1726867675.06011: done dumping result, returning 30575 1726867675.06021: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-00000000232f] 30575 1726867675.06024: sending task result for task 0affcac9-a3a5-e081-a588-00000000232f 30575 1726867675.06109: done sending task result for task 0affcac9-a3a5-e081-a588-00000000232f 30575 1726867675.06112: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867675.06160: no more pending results, returning what we have 30575 1726867675.06163: results queue empty 30575 1726867675.06164: checking for any_errors_fatal 30575 1726867675.06169: done checking for any_errors_fatal 30575 1726867675.06169: checking for max_fail_percentage 30575 1726867675.06171: done checking for max_fail_percentage 30575 1726867675.06172: checking to see if all hosts have failed and the running result is not ok 30575 1726867675.06173: done checking to see if all hosts have failed 30575 1726867675.06173: getting the remaining hosts for this loop 30575 1726867675.06175: done getting the remaining hosts for this loop 30575 1726867675.06181: getting the next task for host managed_node3 30575 1726867675.06190: done getting next task for host managed_node3 30575 1726867675.06193: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30575 1726867675.06198: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867675.06224: getting variables 30575 1726867675.06226: in VariableManager get_vars() 30575 1726867675.06269: Calling all_inventory to load vars for managed_node3 30575 1726867675.06271: Calling groups_inventory to load vars for managed_node3 30575 1726867675.06273: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867675.06286: Calling all_plugins_play to load vars for managed_node3 30575 1726867675.06289: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867675.06291: Calling groups_plugins_play to load vars for managed_node3 30575 1726867675.07091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867675.07973: done with get_vars() 30575 1726867675.07991: done getting variables 30575 1726867675.08033: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:27:55 -0400 (0:00:00.059) 0:01:50.458 ****** 30575 1726867675.08057: entering _queue_task() for managed_node3/package 30575 1726867675.08275: worker is 1 (out of 1 available) 30575 1726867675.08291: exiting _queue_task() for managed_node3/package 30575 1726867675.08303: done queuing things up, now waiting for results queue to drain 30575 1726867675.08304: waiting for pending results... 30575 1726867675.08481: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30575 1726867675.08582: in run() - task 0affcac9-a3a5-e081-a588-000000002330 30575 1726867675.08597: variable 'ansible_search_path' from source: unknown 30575 1726867675.08601: variable 'ansible_search_path' from source: unknown 30575 1726867675.08632: calling self._execute() 30575 1726867675.08705: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867675.08708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867675.08719: variable 'omit' from source: magic vars 30575 1726867675.08975: variable 'ansible_distribution_major_version' from source: facts 30575 1726867675.08987: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867675.09118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867675.09299: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867675.09333: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867675.09357: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867675.09410: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867675.09493: variable 'network_packages' from source: role '' defaults 30575 1726867675.09565: variable '__network_provider_setup' from source: role '' defaults 30575 1726867675.09573: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867675.09621: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867675.09624: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867675.09669: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867675.09783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867675.11333: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867675.11374: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867675.11403: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867675.11427: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867675.11447: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867675.11504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.11526: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.11543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.11569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.11580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.11611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.11630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.11646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.11670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.11684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 
1726867675.11816: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867675.11884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.11914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.11934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.11957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.11967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.12031: variable 'ansible_python' from source: facts 30575 1726867675.12043: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867675.12098: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867675.12156: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867675.12239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.12256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.12273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.12298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.12309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.12343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.12366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.12384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.12409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.12421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.12520: variable 'network_connections' from source: include params 
30575 1726867675.12523: variable 'interface' from source: play vars 30575 1726867675.12592: variable 'interface' from source: play vars 30575 1726867675.12639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867675.12659: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867675.12681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.12702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867675.12737: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867675.12910: variable 'network_connections' from source: include params 30575 1726867675.12913: variable 'interface' from source: play vars 30575 1726867675.12980: variable 'interface' from source: play vars 30575 1726867675.13003: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867675.13055: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867675.13243: variable 'network_connections' from source: include params 30575 1726867675.13246: variable 'interface' from source: play vars 30575 1726867675.13292: variable 'interface' from source: play vars 30575 1726867675.13308: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867675.13362: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867675.13551: variable 'network_connections' 
from source: include params 30575 1726867675.13554: variable 'interface' from source: play vars 30575 1726867675.13600: variable 'interface' from source: play vars 30575 1726867675.13635: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867675.13680: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867675.13685: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867675.13727: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867675.13857: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867675.14145: variable 'network_connections' from source: include params 30575 1726867675.14149: variable 'interface' from source: play vars 30575 1726867675.14194: variable 'interface' from source: play vars 30575 1726867675.14197: variable 'ansible_distribution' from source: facts 30575 1726867675.14200: variable '__network_rh_distros' from source: role '' defaults 30575 1726867675.14205: variable 'ansible_distribution_major_version' from source: facts 30575 1726867675.14215: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867675.14320: variable 'ansible_distribution' from source: facts 30575 1726867675.14323: variable '__network_rh_distros' from source: role '' defaults 30575 1726867675.14326: variable 'ansible_distribution_major_version' from source: facts 30575 1726867675.14336: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867675.14439: variable 'ansible_distribution' from source: facts 30575 1726867675.14442: variable '__network_rh_distros' from source: role '' defaults 30575 1726867675.14446: variable 'ansible_distribution_major_version' from source: facts 30575 1726867675.14472: variable 'network_provider' from source: set_fact 30575 
1726867675.14484: variable 'ansible_facts' from source: unknown 30575 1726867675.14819: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30575 1726867675.14822: when evaluation is False, skipping this task 30575 1726867675.14825: _execute() done 30575 1726867675.14828: dumping result to json 30575 1726867675.14830: done dumping result, returning 30575 1726867675.14837: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-e081-a588-000000002330] 30575 1726867675.14844: sending task result for task 0affcac9-a3a5-e081-a588-000000002330 30575 1726867675.14927: done sending task result for task 0affcac9-a3a5-e081-a588-000000002330 30575 1726867675.14930: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30575 1726867675.14996: no more pending results, returning what we have 30575 1726867675.14999: results queue empty 30575 1726867675.14999: checking for any_errors_fatal 30575 1726867675.15006: done checking for any_errors_fatal 30575 1726867675.15006: checking for max_fail_percentage 30575 1726867675.15008: done checking for max_fail_percentage 30575 1726867675.15009: checking to see if all hosts have failed and the running result is not ok 30575 1726867675.15010: done checking to see if all hosts have failed 30575 1726867675.15010: getting the remaining hosts for this loop 30575 1726867675.15012: done getting the remaining hosts for this loop 30575 1726867675.15015: getting the next task for host managed_node3 30575 1726867675.15025: done getting next task for host managed_node3 30575 1726867675.15029: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867675.15033: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867675.15057: getting variables 30575 1726867675.15059: in VariableManager get_vars() 30575 1726867675.15103: Calling all_inventory to load vars for managed_node3 30575 1726867675.15105: Calling groups_inventory to load vars for managed_node3 30575 1726867675.15108: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867675.15118: Calling all_plugins_play to load vars for managed_node3 30575 1726867675.15121: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867675.15123: Calling groups_plugins_play to load vars for managed_node3 30575 1726867675.16058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867675.16906: done with get_vars() 30575 1726867675.16925: done getting variables 30575 1726867675.16966: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:27:55 -0400 (0:00:00.089) 0:01:50.547 ****** 30575 1726867675.16990: entering _queue_task() for managed_node3/package 30575 1726867675.17213: worker is 1 (out of 1 available) 30575 1726867675.17230: exiting _queue_task() for managed_node3/package 30575 1726867675.17243: done queuing things up, now waiting for results queue to drain 30575 1726867675.17244: waiting for pending results... 
30575 1726867675.17421: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867675.17510: in run() - task 0affcac9-a3a5-e081-a588-000000002331 30575 1726867675.17521: variable 'ansible_search_path' from source: unknown 30575 1726867675.17525: variable 'ansible_search_path' from source: unknown 30575 1726867675.17554: calling self._execute() 30575 1726867675.17627: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867675.17631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867675.17639: variable 'omit' from source: magic vars 30575 1726867675.17902: variable 'ansible_distribution_major_version' from source: facts 30575 1726867675.17910: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867675.17992: variable 'network_state' from source: role '' defaults 30575 1726867675.18001: Evaluated conditional (network_state != {}): False 30575 1726867675.18005: when evaluation is False, skipping this task 30575 1726867675.18008: _execute() done 30575 1726867675.18011: dumping result to json 30575 1726867675.18015: done dumping result, returning 30575 1726867675.18020: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000002331] 30575 1726867675.18029: sending task result for task 0affcac9-a3a5-e081-a588-000000002331 30575 1726867675.18111: done sending task result for task 0affcac9-a3a5-e081-a588-000000002331 30575 1726867675.18114: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867675.18172: no more pending results, returning what we have 30575 1726867675.18176: results queue empty 30575 1726867675.18176: checking 
for any_errors_fatal 30575 1726867675.18186: done checking for any_errors_fatal 30575 1726867675.18187: checking for max_fail_percentage 30575 1726867675.18189: done checking for max_fail_percentage 30575 1726867675.18190: checking to see if all hosts have failed and the running result is not ok 30575 1726867675.18190: done checking to see if all hosts have failed 30575 1726867675.18191: getting the remaining hosts for this loop 30575 1726867675.18192: done getting the remaining hosts for this loop 30575 1726867675.18195: getting the next task for host managed_node3 30575 1726867675.18203: done getting next task for host managed_node3 30575 1726867675.18206: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867675.18211: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867675.18233: getting variables 30575 1726867675.18234: in VariableManager get_vars() 30575 1726867675.18267: Calling all_inventory to load vars for managed_node3 30575 1726867675.18269: Calling groups_inventory to load vars for managed_node3 30575 1726867675.18271: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867675.18280: Calling all_plugins_play to load vars for managed_node3 30575 1726867675.18282: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867675.18284: Calling groups_plugins_play to load vars for managed_node3 30575 1726867675.19020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867675.19899: done with get_vars() 30575 1726867675.19915: done getting variables 30575 1726867675.19957: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:27:55 -0400 (0:00:00.029) 0:01:50.577 ****** 30575 1726867675.19983: entering _queue_task() for managed_node3/package 30575 1726867675.20189: worker is 1 (out of 1 available) 30575 1726867675.20202: exiting _queue_task() for managed_node3/package 30575 1726867675.20215: done queuing things up, now waiting for results queue to drain 30575 1726867675.20220: waiting for pending results... 
30575 1726867675.20392: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867675.20485: in run() - task 0affcac9-a3a5-e081-a588-000000002332 30575 1726867675.20496: variable 'ansible_search_path' from source: unknown 30575 1726867675.20500: variable 'ansible_search_path' from source: unknown 30575 1726867675.20528: calling self._execute() 30575 1726867675.20603: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867675.20606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867675.20615: variable 'omit' from source: magic vars 30575 1726867675.20880: variable 'ansible_distribution_major_version' from source: facts 30575 1726867675.20890: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867675.20972: variable 'network_state' from source: role '' defaults 30575 1726867675.20982: Evaluated conditional (network_state != {}): False 30575 1726867675.20985: when evaluation is False, skipping this task 30575 1726867675.20988: _execute() done 30575 1726867675.20992: dumping result to json 30575 1726867675.20994: done dumping result, returning 30575 1726867675.21006: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-e081-a588-000000002332] 30575 1726867675.21009: sending task result for task 0affcac9-a3a5-e081-a588-000000002332 30575 1726867675.21096: done sending task result for task 0affcac9-a3a5-e081-a588-000000002332 30575 1726867675.21098: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867675.21151: no more pending results, returning what we have 30575 1726867675.21155: results queue empty 30575 1726867675.21155: checking for 
any_errors_fatal 30575 1726867675.21162: done checking for any_errors_fatal 30575 1726867675.21163: checking for max_fail_percentage 30575 1726867675.21164: done checking for max_fail_percentage 30575 1726867675.21165: checking to see if all hosts have failed and the running result is not ok 30575 1726867675.21166: done checking to see if all hosts have failed 30575 1726867675.21167: getting the remaining hosts for this loop 30575 1726867675.21168: done getting the remaining hosts for this loop 30575 1726867675.21171: getting the next task for host managed_node3 30575 1726867675.21181: done getting next task for host managed_node3 30575 1726867675.21184: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867675.21189: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867675.21209: getting variables 30575 1726867675.21210: in VariableManager get_vars() 30575 1726867675.21250: Calling all_inventory to load vars for managed_node3 30575 1726867675.21253: Calling groups_inventory to load vars for managed_node3 30575 1726867675.21255: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867675.21263: Calling all_plugins_play to load vars for managed_node3 30575 1726867675.21265: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867675.21268: Calling groups_plugins_play to load vars for managed_node3 30575 1726867675.22137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867675.22999: done with get_vars() 30575 1726867675.23014: done getting variables 30575 1726867675.23057: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:27:55 -0400 (0:00:00.030) 0:01:50.608 ****** 30575 1726867675.23083: entering _queue_task() for managed_node3/service 30575 1726867675.23304: worker is 1 (out of 1 available) 30575 1726867675.23320: exiting _queue_task() for managed_node3/service 30575 1726867675.23332: done queuing things up, now waiting for results queue to drain 30575 1726867675.23334: waiting for pending results... 
30575 1726867675.23513: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867675.23610: in run() - task 0affcac9-a3a5-e081-a588-000000002333 30575 1726867675.23623: variable 'ansible_search_path' from source: unknown 30575 1726867675.23626: variable 'ansible_search_path' from source: unknown 30575 1726867675.23654: calling self._execute() 30575 1726867675.23729: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867675.23733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867675.23742: variable 'omit' from source: magic vars 30575 1726867675.24014: variable 'ansible_distribution_major_version' from source: facts 30575 1726867675.24024: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867675.24111: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867675.24240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867675.25738: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867675.25791: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867675.25820: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867675.25845: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867675.25871: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867675.25931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30575 1726867675.25956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.25974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.26002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.26013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.26046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.26066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.26084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.26109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.26121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.26147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.26164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.26183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.26207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.26220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.26330: variable 'network_connections' from source: include params 30575 1726867675.26340: variable 'interface' from source: play vars 30575 1726867675.26393: variable 'interface' from source: play vars 30575 1726867675.26442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867675.26552: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867675.26589: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867675.26612: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867675.26635: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867675.26666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867675.26683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867675.26701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.26722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867675.26758: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867675.26911: variable 'network_connections' from source: include params 30575 1726867675.26915: variable 'interface' from source: play vars 30575 1726867675.26962: variable 'interface' from source: play vars 30575 1726867675.26981: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867675.26985: when evaluation is False, skipping this task 30575 1726867675.26988: _execute() done 30575 1726867675.26992: dumping result to json 30575 1726867675.26995: done dumping result, returning 30575 1726867675.27003: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000002333] 30575 1726867675.27008: sending task result for task 0affcac9-a3a5-e081-a588-000000002333 30575 1726867675.27096: done sending task result for task 
0affcac9-a3a5-e081-a588-000000002333 30575 1726867675.27105: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867675.27151: no more pending results, returning what we have 30575 1726867675.27154: results queue empty 30575 1726867675.27154: checking for any_errors_fatal 30575 1726867675.27162: done checking for any_errors_fatal 30575 1726867675.27162: checking for max_fail_percentage 30575 1726867675.27164: done checking for max_fail_percentage 30575 1726867675.27165: checking to see if all hosts have failed and the running result is not ok 30575 1726867675.27166: done checking to see if all hosts have failed 30575 1726867675.27167: getting the remaining hosts for this loop 30575 1726867675.27168: done getting the remaining hosts for this loop 30575 1726867675.27172: getting the next task for host managed_node3 30575 1726867675.27182: done getting next task for host managed_node3 30575 1726867675.27187: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867675.27191: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867675.27219: getting variables 30575 1726867675.27221: in VariableManager get_vars() 30575 1726867675.27267: Calling all_inventory to load vars for managed_node3 30575 1726867675.27269: Calling groups_inventory to load vars for managed_node3 30575 1726867675.27271: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867675.27285: Calling all_plugins_play to load vars for managed_node3 30575 1726867675.27287: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867675.27290: Calling groups_plugins_play to load vars for managed_node3 30575 1726867675.28135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867675.29144: done with get_vars() 30575 1726867675.29162: done getting variables 30575 1726867675.29206: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:27:55 -0400 (0:00:00.061) 0:01:50.669 ****** 30575 1726867675.29235: entering _queue_task() for managed_node3/service 30575 1726867675.29505: worker is 1 (out of 1 available) 30575 1726867675.29523: exiting _queue_task() for managed_node3/service 30575 1726867675.29536: done 
queuing things up, now waiting for results queue to drain 30575 1726867675.29537: waiting for pending results... 30575 1726867675.29734: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867675.29836: in run() - task 0affcac9-a3a5-e081-a588-000000002334 30575 1726867675.29848: variable 'ansible_search_path' from source: unknown 30575 1726867675.29851: variable 'ansible_search_path' from source: unknown 30575 1726867675.29886: calling self._execute() 30575 1726867675.29960: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867675.29964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867675.29973: variable 'omit' from source: magic vars 30575 1726867675.30253: variable 'ansible_distribution_major_version' from source: facts 30575 1726867675.30262: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867675.30373: variable 'network_provider' from source: set_fact 30575 1726867675.30378: variable 'network_state' from source: role '' defaults 30575 1726867675.30388: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30575 1726867675.30394: variable 'omit' from source: magic vars 30575 1726867675.30441: variable 'omit' from source: magic vars 30575 1726867675.30460: variable 'network_service_name' from source: role '' defaults 30575 1726867675.30507: variable 'network_service_name' from source: role '' defaults 30575 1726867675.30582: variable '__network_provider_setup' from source: role '' defaults 30575 1726867675.30586: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867675.30633: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867675.30638: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867675.30684: variable '__network_packages_default_nm' from source: role '' 
defaults 30575 1726867675.30827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867675.32278: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867675.32328: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867675.32354: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867675.32384: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867675.32403: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867675.32462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.32487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.32504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.32531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.32542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.32572: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.32592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.32608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.32635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.32645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.32797: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867675.32868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.32887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.32903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.32932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.32943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.33002: variable 'ansible_python' from source: facts 30575 1726867675.33014: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867675.33070: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867675.33136: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867675.33205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.33245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.33249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.33265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.33276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.33310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.33330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.33348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.33374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.33386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.33473: variable 'network_connections' from source: include params 30575 1726867675.33480: variable 'interface' from source: play vars 30575 1726867675.33530: variable 'interface' from source: play vars 30575 1726867675.33602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867675.33726: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867675.33759: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867675.33794: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867675.33881: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867675.33883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867675.33885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867675.33910: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.33934: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867675.33970: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867675.34153: variable 'network_connections' from source: include params 30575 1726867675.34159: variable 'interface' from source: play vars 30575 1726867675.34212: variable 'interface' from source: play vars 30575 1726867675.34239: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867675.34292: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867675.34478: variable 'network_connections' from source: include params 30575 1726867675.34482: variable 'interface' from source: play vars 30575 1726867675.34533: variable 'interface' from source: play vars 30575 1726867675.34550: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867675.34603: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867675.34785: variable 'network_connections' from source: include params 30575 1726867675.34789: variable 'interface' from source: play vars 30575 1726867675.34839: variable 'interface' from source: play vars 30575 1726867675.34876: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30575 1726867675.34919: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867675.34927: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867675.34969: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867675.35104: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867675.35410: variable 'network_connections' from source: include params 30575 1726867675.35413: variable 'interface' from source: play vars 30575 1726867675.35459: variable 'interface' from source: play vars 30575 1726867675.35465: variable 'ansible_distribution' from source: facts 30575 1726867675.35467: variable '__network_rh_distros' from source: role '' defaults 30575 1726867675.35474: variable 'ansible_distribution_major_version' from source: facts 30575 1726867675.35486: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867675.35596: variable 'ansible_distribution' from source: facts 30575 1726867675.35599: variable '__network_rh_distros' from source: role '' defaults 30575 1726867675.35603: variable 'ansible_distribution_major_version' from source: facts 30575 1726867675.35615: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867675.35727: variable 'ansible_distribution' from source: facts 30575 1726867675.35730: variable '__network_rh_distros' from source: role '' defaults 30575 1726867675.35734: variable 'ansible_distribution_major_version' from source: facts 30575 1726867675.35760: variable 'network_provider' from source: set_fact 30575 1726867675.35782: variable 'omit' from source: magic vars 30575 1726867675.35798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867675.35817: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867675.35834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867675.35847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867675.35856: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867675.35881: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867675.35884: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867675.35887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867675.35952: Set connection var ansible_pipelining to False 30575 1726867675.35955: Set connection var ansible_shell_type to sh 30575 1726867675.35960: Set connection var ansible_shell_executable to /bin/sh 30575 1726867675.35965: Set connection var ansible_timeout to 10 30575 1726867675.35969: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867675.35978: Set connection var ansible_connection to ssh 30575 1726867675.35996: variable 'ansible_shell_executable' from source: unknown 30575 1726867675.35999: variable 'ansible_connection' from source: unknown 30575 1726867675.36002: variable 'ansible_module_compression' from source: unknown 30575 1726867675.36004: variable 'ansible_shell_type' from source: unknown 30575 1726867675.36006: variable 'ansible_shell_executable' from source: unknown 30575 1726867675.36008: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867675.36012: variable 'ansible_pipelining' from source: unknown 30575 1726867675.36014: variable 'ansible_timeout' from source: unknown 30575 1726867675.36021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867675.36089: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867675.36097: variable 'omit' from source: magic vars 30575 1726867675.36103: starting attempt loop 30575 1726867675.36106: running the handler 30575 1726867675.36159: variable 'ansible_facts' from source: unknown 30575 1726867675.36637: _low_level_execute_command(): starting 30575 1726867675.36643: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867675.37131: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867675.37135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867675.37138: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867675.37140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867675.37194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 
30575 1726867675.37197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867675.37258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867675.38954: stdout chunk (state=3): >>>/root <<< 30575 1726867675.39052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867675.39082: stderr chunk (state=3): >>><<< 30575 1726867675.39086: stdout chunk (state=3): >>><<< 30575 1726867675.39105: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867675.39114: _low_level_execute_command(): starting 30575 1726867675.39123: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867675.3910484-35582-131233658931277 `" && echo ansible-tmp-1726867675.3910484-35582-131233658931277="` echo /root/.ansible/tmp/ansible-tmp-1726867675.3910484-35582-131233658931277 `" ) && sleep 0' 30575 1726867675.39547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867675.39550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867675.39561: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867675.39619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867675.39624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867675.39625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867675.39670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867675.41541: stdout chunk (state=3): >>>ansible-tmp-1726867675.3910484-35582-131233658931277=/root/.ansible/tmp/ansible-tmp-1726867675.3910484-35582-131233658931277 <<< 30575 1726867675.41649: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 30575 1726867675.41672: stderr chunk (state=3): >>><<< 30575 1726867675.41675: stdout chunk (state=3): >>><<< 30575 1726867675.41690: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867675.3910484-35582-131233658931277=/root/.ansible/tmp/ansible-tmp-1726867675.3910484-35582-131233658931277 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867675.41713: variable 'ansible_module_compression' from source: unknown 30575 1726867675.41751: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30575 1726867675.41801: variable 'ansible_facts' from source: unknown 30575 1726867675.41936: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867675.3910484-35582-131233658931277/AnsiballZ_systemd.py 30575 1726867675.42032: Sending 
initial data 30575 1726867675.42035: Sent initial data (156 bytes) 30575 1726867675.42460: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867675.42464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867675.42470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867675.42472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867675.42474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867675.42521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867675.42526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867675.42569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867675.44102: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server 
supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867675.44143: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867675.44189: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp6bk64o3o /root/.ansible/tmp/ansible-tmp-1726867675.3910484-35582-131233658931277/AnsiballZ_systemd.py <<< 30575 1726867675.44198: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867675.3910484-35582-131233658931277/AnsiballZ_systemd.py" <<< 30575 1726867675.44236: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp6bk64o3o" to remote "/root/.ansible/tmp/ansible-tmp-1726867675.3910484-35582-131233658931277/AnsiballZ_systemd.py" <<< 30575 1726867675.44239: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867675.3910484-35582-131233658931277/AnsiballZ_systemd.py" <<< 30575 1726867675.45274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867675.45311: stderr chunk (state=3): >>><<< 30575 1726867675.45315: stdout chunk (state=3): >>><<< 30575 1726867675.45354: done transferring module to remote 30575 1726867675.45362: _low_level_execute_command(): starting 30575 1726867675.45364: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867675.3910484-35582-131233658931277/ 
/root/.ansible/tmp/ansible-tmp-1726867675.3910484-35582-131233658931277/AnsiballZ_systemd.py && sleep 0' 30575 1726867675.45774: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867675.45778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867675.45781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867675.45783: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867675.45785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867675.45839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867675.45843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867675.45883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867675.47616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867675.47639: stderr chunk (state=3): >>><<< 30575 1726867675.47643: stdout chunk (state=3): >>><<< 30575 1726867675.47654: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867675.47657: _low_level_execute_command(): starting 30575 1726867675.47659: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867675.3910484-35582-131233658931277/AnsiballZ_systemd.py && sleep 0' 30575 1726867675.48058: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867675.48062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867675.48065: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867675.48067: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867675.48069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867675.48116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867675.48123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867675.48173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867675.77006: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", 
"OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10575872", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3312173056", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "2012387000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", 
"StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30575 1726867675.77031: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", 
"LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system<<< 30575 1726867675.77042: stdout chunk (state=3): >>>.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", 
"InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30575 1726867675.78876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867675.78909: stderr chunk (state=3): >>><<< 30575 1726867675.78912: stdout chunk (state=3): >>><<< 30575 1726867675.78931: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10575872", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3312173056", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "2012387000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867675.79057: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867675.3910484-35582-131233658931277/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867675.79073: _low_level_execute_command(): starting 30575 1726867675.79080: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867675.3910484-35582-131233658931277/ > /dev/null 2>&1 && sleep 0' 30575 1726867675.79541: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867675.79544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867675.79547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867675.79549: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867675.79551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867675.79553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867675.79602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867675.79609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867675.79653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867675.81468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867675.81495: stderr chunk (state=3): >>><<< 30575 1726867675.81499: stdout chunk (state=3): >>><<< 30575 1726867675.81509: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867675.81517: handler run complete 30575 1726867675.81557: attempt loop complete, returning result 30575 1726867675.81560: _execute() done 30575 1726867675.81562: dumping result to json 30575 1726867675.81574: done dumping result, returning 30575 1726867675.81585: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-e081-a588-000000002334] 30575 1726867675.81590: sending task result for task 0affcac9-a3a5-e081-a588-000000002334 30575 1726867675.81825: done sending task result for task 0affcac9-a3a5-e081-a588-000000002334 30575 1726867675.81828: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867675.81887: no more pending results, returning what we have 30575 1726867675.81890: results queue empty 30575 1726867675.81891: checking for any_errors_fatal 30575 1726867675.81898: done checking for any_errors_fatal 30575 1726867675.81898: checking for max_fail_percentage 30575 1726867675.81900: done checking for max_fail_percentage 30575 1726867675.81901: checking to see if all hosts have failed and the running result is not ok 30575 1726867675.81901: done checking to see if all hosts have failed 30575 1726867675.81902: getting the remaining hosts for this loop 30575 1726867675.81903: done getting the remaining hosts for this loop 30575 1726867675.81907: getting the next task for host managed_node3 30575 1726867675.81914: done getting next task for host managed_node3 30575 1726867675.81917: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867675.81922: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867675.81935: getting variables 30575 1726867675.81936: in VariableManager get_vars() 30575 1726867675.81973: Calling all_inventory to load vars for managed_node3 30575 1726867675.81976: Calling groups_inventory to load vars for managed_node3 30575 1726867675.81980: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867675.81989: Calling all_plugins_play to load vars for managed_node3 30575 1726867675.81991: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867675.81994: Calling groups_plugins_play to load vars for managed_node3 30575 1726867675.82805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867675.83669: done with get_vars() 30575 1726867675.83687: done getting variables 30575 1726867675.83730: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:27:55 -0400 (0:00:00.545) 0:01:51.215 ****** 30575 1726867675.83759: entering _queue_task() for managed_node3/service 30575 1726867675.83985: worker is 1 (out of 1 available) 30575 1726867675.83998: exiting _queue_task() for managed_node3/service 30575 1726867675.84011: done queuing things up, now waiting for results queue to drain 30575 1726867675.84012: waiting for pending results... 
30575 1726867675.84208: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867675.84309: in run() - task 0affcac9-a3a5-e081-a588-000000002335 30575 1726867675.84323: variable 'ansible_search_path' from source: unknown 30575 1726867675.84328: variable 'ansible_search_path' from source: unknown 30575 1726867675.84357: calling self._execute() 30575 1726867675.84436: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867675.84440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867675.84450: variable 'omit' from source: magic vars 30575 1726867675.84735: variable 'ansible_distribution_major_version' from source: facts 30575 1726867675.84744: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867675.84826: variable 'network_provider' from source: set_fact 30575 1726867675.84831: Evaluated conditional (network_provider == "nm"): True 30575 1726867675.84897: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867675.84960: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867675.85075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867675.86765: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867675.86808: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867675.86838: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867675.86865: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867675.86886: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867675.86945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.86968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.86987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.87014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.87027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.87058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.87076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.87095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.87119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.87132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.87158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.87176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.87193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.87217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.87230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.87325: variable 'network_connections' from source: include params 30575 1726867675.87333: variable 'interface' from source: play vars 30575 1726867675.87379: variable 'interface' from source: play vars 30575 1726867675.87432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867675.87549: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867675.87575: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867675.87600: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867675.87625: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867675.87656: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867675.87673: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867675.87691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.87709: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867675.87749: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867675.87896: variable 'network_connections' from source: include params 30575 1726867675.87899: variable 'interface' from source: play vars 30575 1726867675.87944: variable 'interface' from source: play vars 30575 1726867675.87965: Evaluated conditional (__network_wpa_supplicant_required): False 30575 1726867675.87968: when evaluation is False, skipping this task 30575 1726867675.87970: _execute() done 30575 1726867675.87973: dumping result to json 30575 1726867675.87979: done dumping result, returning 30575 1726867675.87986: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-e081-a588-000000002335] 30575 
1726867675.87995: sending task result for task 0affcac9-a3a5-e081-a588-000000002335 30575 1726867675.88074: done sending task result for task 0affcac9-a3a5-e081-a588-000000002335 30575 1726867675.88078: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30575 1726867675.88122: no more pending results, returning what we have 30575 1726867675.88125: results queue empty 30575 1726867675.88126: checking for any_errors_fatal 30575 1726867675.88141: done checking for any_errors_fatal 30575 1726867675.88142: checking for max_fail_percentage 30575 1726867675.88144: done checking for max_fail_percentage 30575 1726867675.88145: checking to see if all hosts have failed and the running result is not ok 30575 1726867675.88146: done checking to see if all hosts have failed 30575 1726867675.88146: getting the remaining hosts for this loop 30575 1726867675.88148: done getting the remaining hosts for this loop 30575 1726867675.88151: getting the next task for host managed_node3 30575 1726867675.88159: done getting next task for host managed_node3 30575 1726867675.88162: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867675.88166: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867675.88191: getting variables 30575 1726867675.88192: in VariableManager get_vars() 30575 1726867675.88231: Calling all_inventory to load vars for managed_node3 30575 1726867675.88233: Calling groups_inventory to load vars for managed_node3 30575 1726867675.88235: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867675.88244: Calling all_plugins_play to load vars for managed_node3 30575 1726867675.88246: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867675.88249: Calling groups_plugins_play to load vars for managed_node3 30575 1726867675.89137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867675.89980: done with get_vars() 30575 1726867675.89995: done getting variables 30575 1726867675.90036: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:27:55 -0400 (0:00:00.062) 0:01:51.278 
****** 30575 1726867675.90058: entering _queue_task() for managed_node3/service 30575 1726867675.90274: worker is 1 (out of 1 available) 30575 1726867675.90289: exiting _queue_task() for managed_node3/service 30575 1726867675.90301: done queuing things up, now waiting for results queue to drain 30575 1726867675.90303: waiting for pending results... 30575 1726867675.90481: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867675.90579: in run() - task 0affcac9-a3a5-e081-a588-000000002336 30575 1726867675.90591: variable 'ansible_search_path' from source: unknown 30575 1726867675.90594: variable 'ansible_search_path' from source: unknown 30575 1726867675.90624: calling self._execute() 30575 1726867675.90698: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867675.90702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867675.90710: variable 'omit' from source: magic vars 30575 1726867675.90981: variable 'ansible_distribution_major_version' from source: facts 30575 1726867675.90990: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867675.91068: variable 'network_provider' from source: set_fact 30575 1726867675.91072: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867675.91075: when evaluation is False, skipping this task 30575 1726867675.91079: _execute() done 30575 1726867675.91082: dumping result to json 30575 1726867675.91085: done dumping result, returning 30575 1726867675.91095: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-e081-a588-000000002336] 30575 1726867675.91100: sending task result for task 0affcac9-a3a5-e081-a588-000000002336 30575 1726867675.91184: done sending task result for task 0affcac9-a3a5-e081-a588-000000002336 30575 1726867675.91187: WORKER PROCESS EXITING skipping: 
[managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867675.91238: no more pending results, returning what we have 30575 1726867675.91242: results queue empty 30575 1726867675.91242: checking for any_errors_fatal 30575 1726867675.91248: done checking for any_errors_fatal 30575 1726867675.91249: checking for max_fail_percentage 30575 1726867675.91250: done checking for max_fail_percentage 30575 1726867675.91251: checking to see if all hosts have failed and the running result is not ok 30575 1726867675.91252: done checking to see if all hosts have failed 30575 1726867675.91253: getting the remaining hosts for this loop 30575 1726867675.91254: done getting the remaining hosts for this loop 30575 1726867675.91258: getting the next task for host managed_node3 30575 1726867675.91265: done getting next task for host managed_node3 30575 1726867675.91269: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867675.91273: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867675.91294: getting variables 30575 1726867675.91296: in VariableManager get_vars() 30575 1726867675.91331: Calling all_inventory to load vars for managed_node3 30575 1726867675.91333: Calling groups_inventory to load vars for managed_node3 30575 1726867675.91335: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867675.91342: Calling all_plugins_play to load vars for managed_node3 30575 1726867675.91345: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867675.91347: Calling groups_plugins_play to load vars for managed_node3 30575 1726867675.92082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867675.93045: done with get_vars() 30575 1726867675.93060: done getting variables 30575 1726867675.93102: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:27:55 -0400 (0:00:00.030) 0:01:51.308 ****** 30575 1726867675.93127: entering _queue_task() for managed_node3/copy 30575 1726867675.93324: worker is 1 (out of 1 available) 30575 1726867675.93338: exiting _queue_task() for managed_node3/copy 30575 1726867675.93350: done queuing things up, now waiting for results queue to drain 30575 1726867675.93351: waiting for 
pending results... 30575 1726867675.93532: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867675.93625: in run() - task 0affcac9-a3a5-e081-a588-000000002337 30575 1726867675.93635: variable 'ansible_search_path' from source: unknown 30575 1726867675.93638: variable 'ansible_search_path' from source: unknown 30575 1726867675.93665: calling self._execute() 30575 1726867675.93744: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867675.93747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867675.93756: variable 'omit' from source: magic vars 30575 1726867675.94029: variable 'ansible_distribution_major_version' from source: facts 30575 1726867675.94037: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867675.94115: variable 'network_provider' from source: set_fact 30575 1726867675.94119: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867675.94127: when evaluation is False, skipping this task 30575 1726867675.94130: _execute() done 30575 1726867675.94132: dumping result to json 30575 1726867675.94135: done dumping result, returning 30575 1726867675.94143: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-e081-a588-000000002337] 30575 1726867675.94148: sending task result for task 0affcac9-a3a5-e081-a588-000000002337 skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30575 1726867675.94278: no more pending results, returning what we have 30575 1726867675.94282: results queue empty 30575 1726867675.94283: checking for any_errors_fatal 30575 1726867675.94288: done checking for any_errors_fatal 30575 1726867675.94289: checking for 
max_fail_percentage 30575 1726867675.94291: done checking for max_fail_percentage 30575 1726867675.94292: checking to see if all hosts have failed and the running result is not ok 30575 1726867675.94293: done checking to see if all hosts have failed 30575 1726867675.94293: getting the remaining hosts for this loop 30575 1726867675.94294: done getting the remaining hosts for this loop 30575 1726867675.94298: getting the next task for host managed_node3 30575 1726867675.94305: done getting next task for host managed_node3 30575 1726867675.94309: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867675.94313: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867675.94333: getting variables 30575 1726867675.94334: in VariableManager get_vars() 30575 1726867675.94369: Calling all_inventory to load vars for managed_node3 30575 1726867675.94371: Calling groups_inventory to load vars for managed_node3 30575 1726867675.94373: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867675.94382: Calling all_plugins_play to load vars for managed_node3 30575 1726867675.94385: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867675.94387: Calling groups_plugins_play to load vars for managed_node3 30575 1726867675.94922: done sending task result for task 0affcac9-a3a5-e081-a588-000000002337 30575 1726867675.95146: WORKER PROCESS EXITING 30575 1726867675.95157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867675.96018: done with get_vars() 30575 1726867675.96035: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:27:55 -0400 (0:00:00.029) 0:01:51.338 ****** 30575 1726867675.96092: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867675.96283: worker is 1 (out of 1 available) 30575 1726867675.96297: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867675.96309: done queuing things up, now waiting for results queue to drain 30575 1726867675.96311: waiting for pending results... 
30575 1726867675.96497: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867675.96585: in run() - task 0affcac9-a3a5-e081-a588-000000002338 30575 1726867675.96598: variable 'ansible_search_path' from source: unknown 30575 1726867675.96602: variable 'ansible_search_path' from source: unknown 30575 1726867675.96630: calling self._execute() 30575 1726867675.96712: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867675.96715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867675.96725: variable 'omit' from source: magic vars 30575 1726867675.96991: variable 'ansible_distribution_major_version' from source: facts 30575 1726867675.97000: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867675.97007: variable 'omit' from source: magic vars 30575 1726867675.97050: variable 'omit' from source: magic vars 30575 1726867675.97156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867675.98605: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867675.98649: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867675.98675: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867675.98702: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867675.98723: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867675.98775: variable 'network_provider' from source: set_fact 30575 1726867675.98866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867675.98888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867675.98906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867675.98934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867675.98946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867675.98997: variable 'omit' from source: magic vars 30575 1726867675.99069: variable 'omit' from source: magic vars 30575 1726867675.99138: variable 'network_connections' from source: include params 30575 1726867675.99150: variable 'interface' from source: play vars 30575 1726867675.99193: variable 'interface' from source: play vars 30575 1726867675.99296: variable 'omit' from source: magic vars 30575 1726867675.99303: variable '__lsr_ansible_managed' from source: task vars 30575 1726867675.99345: variable '__lsr_ansible_managed' from source: task vars 30575 1726867675.99471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30575 1726867675.99607: Loaded config def from plugin (lookup/template) 30575 1726867675.99610: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30575 1726867675.99631: File lookup term: get_ansible_managed.j2 30575 1726867675.99634: variable 
'ansible_search_path' from source: unknown 30575 1726867675.99637: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30575 1726867675.99648: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30575 1726867675.99662: variable 'ansible_search_path' from source: unknown 30575 1726867676.03014: variable 'ansible_managed' from source: unknown 30575 1726867676.03089: variable 'omit' from source: magic vars 30575 1726867676.03108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867676.03127: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867676.03140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867676.03152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30575 1726867676.03162: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867676.03187: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867676.03190: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867676.03193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867676.03255: Set connection var ansible_pipelining to False 30575 1726867676.03258: Set connection var ansible_shell_type to sh 30575 1726867676.03263: Set connection var ansible_shell_executable to /bin/sh 30575 1726867676.03272: Set connection var ansible_timeout to 10 30575 1726867676.03276: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867676.03284: Set connection var ansible_connection to ssh 30575 1726867676.03301: variable 'ansible_shell_executable' from source: unknown 30575 1726867676.03304: variable 'ansible_connection' from source: unknown 30575 1726867676.03306: variable 'ansible_module_compression' from source: unknown 30575 1726867676.03309: variable 'ansible_shell_type' from source: unknown 30575 1726867676.03311: variable 'ansible_shell_executable' from source: unknown 30575 1726867676.03313: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867676.03317: variable 'ansible_pipelining' from source: unknown 30575 1726867676.03323: variable 'ansible_timeout' from source: unknown 30575 1726867676.03327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867676.03412: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867676.03423: variable 'omit' from 
source: magic vars 30575 1726867676.03430: starting attempt loop 30575 1726867676.03433: running the handler 30575 1726867676.03444: _low_level_execute_command(): starting 30575 1726867676.03450: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867676.03930: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867676.03934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.03938: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867676.03940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.03990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867676.03993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867676.04053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867676.05725: stdout chunk (state=3): >>>/root <<< 30575 1726867676.05823: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867676.05849: stderr chunk (state=3): >>><<< 30575 
1726867676.05853: stdout chunk (state=3): >>><<< 30575 1726867676.05870: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867676.05883: _low_level_execute_command(): starting 30575 1726867676.05888: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867676.0586941-35596-88253554755293 `" && echo ansible-tmp-1726867676.0586941-35596-88253554755293="` echo /root/.ansible/tmp/ansible-tmp-1726867676.0586941-35596-88253554755293 `" ) && sleep 0' 30575 1726867676.06287: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867676.06291: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.06302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.06357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867676.06361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867676.06412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867676.08284: stdout chunk (state=3): >>>ansible-tmp-1726867676.0586941-35596-88253554755293=/root/.ansible/tmp/ansible-tmp-1726867676.0586941-35596-88253554755293 <<< 30575 1726867676.08390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867676.08415: stderr chunk (state=3): >>><<< 30575 1726867676.08418: stdout chunk (state=3): >>><<< 30575 1726867676.08434: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867676.0586941-35596-88253554755293=/root/.ansible/tmp/ansible-tmp-1726867676.0586941-35596-88253554755293 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867676.08470: variable 'ansible_module_compression' from source: unknown 30575 1726867676.08505: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30575 1726867676.08547: variable 'ansible_facts' from source: unknown 30575 1726867676.08639: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867676.0586941-35596-88253554755293/AnsiballZ_network_connections.py 30575 1726867676.08736: Sending initial data 30575 1726867676.08739: Sent initial data (167 bytes) 30575 1726867676.09170: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867676.09173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match not found <<< 30575 1726867676.09175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.09183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867676.09185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.09228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867676.09244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867676.09285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867676.10803: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867676.10806: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867676.10843: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867676.10888: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpkmayisq1 /root/.ansible/tmp/ansible-tmp-1726867676.0586941-35596-88253554755293/AnsiballZ_network_connections.py <<< 30575 1726867676.10896: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867676.0586941-35596-88253554755293/AnsiballZ_network_connections.py" <<< 30575 1726867676.10937: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpkmayisq1" to remote "/root/.ansible/tmp/ansible-tmp-1726867676.0586941-35596-88253554755293/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867676.0586941-35596-88253554755293/AnsiballZ_network_connections.py" <<< 30575 1726867676.11644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867676.11681: stderr chunk (state=3): >>><<< 30575 1726867676.11684: stdout chunk (state=3): >>><<< 30575 1726867676.11710: done transferring module to remote 30575 1726867676.11721: _low_level_execute_command(): starting 30575 1726867676.11724: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867676.0586941-35596-88253554755293/ /root/.ansible/tmp/ansible-tmp-1726867676.0586941-35596-88253554755293/AnsiballZ_network_connections.py && sleep 0' 30575 1726867676.12141: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867676.12144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found <<< 30575 1726867676.12146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.12148: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867676.12151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.12200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867676.12203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867676.12251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867676.13979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867676.14001: stderr chunk (state=3): >>><<< 30575 1726867676.14004: stdout chunk (state=3): >>><<< 30575 1726867676.14015: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867676.14018: _low_level_execute_command(): starting 30575 1726867676.14025: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867676.0586941-35596-88253554755293/AnsiballZ_network_connections.py && sleep 0' 30575 1726867676.14408: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867676.14411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.14431: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 
1726867676.14475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867676.14481: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867676.14534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867676.39240: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 0739a9ca-1102-4bed-b35d-0eb6b0f005e6 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30575 1726867676.40932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867676.40959: stderr chunk (state=3): >>><<< 30575 1726867676.40962: stdout chunk (state=3): >>><<< 30575 1726867676.40979: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 0739a9ca-1102-4bed-b35d-0eb6b0f005e6 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867676.41010: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867676.0586941-35596-88253554755293/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867676.41020: _low_level_execute_command(): starting 30575 1726867676.41023: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867676.0586941-35596-88253554755293/ > /dev/null 2>&1 && sleep 0' 30575 1726867676.41480: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867676.41483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867676.41485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.41489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867676.41491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.41545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867676.41552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867676.41553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867676.41595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867676.43412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867676.43439: stderr chunk (state=3): >>><<< 30575 1726867676.43442: stdout chunk (state=3): >>><<< 30575 1726867676.43454: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867676.43461: handler run complete 30575 1726867676.43481: attempt loop complete, returning result 30575 1726867676.43484: _execute() done 30575 1726867676.43486: dumping result to json 30575 1726867676.43491: done dumping result, returning 30575 1726867676.43500: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-e081-a588-000000002338] 30575 1726867676.43504: sending task result for task 0affcac9-a3a5-e081-a588-000000002338 30575 1726867676.43606: done sending task result for task 0affcac9-a3a5-e081-a588-000000002338 30575 1726867676.43609: WORKER PROCESS EXITING ok: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 0739a9ca-1102-4bed-b35d-0eb6b0f005e6 skipped because already active 30575 1726867676.43710: no more pending results, returning what we have 30575 1726867676.43714: results queue empty 30575 1726867676.43714: checking for any_errors_fatal 30575 1726867676.43719: done checking for any_errors_fatal 30575 1726867676.43720: checking for max_fail_percentage 30575 1726867676.43722: done checking for max_fail_percentage 30575 1726867676.43722: checking to see if all hosts have failed and the running result is not ok 30575 1726867676.43723: done checking to see if all hosts have 
failed 30575 1726867676.43724: getting the remaining hosts for this loop 30575 1726867676.43726: done getting the remaining hosts for this loop 30575 1726867676.43729: getting the next task for host managed_node3 30575 1726867676.43737: done getting next task for host managed_node3 30575 1726867676.43740: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867676.43745: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867676.43757: getting variables 30575 1726867676.43758: in VariableManager get_vars() 30575 1726867676.43805: Calling all_inventory to load vars for managed_node3 30575 1726867676.43807: Calling groups_inventory to load vars for managed_node3 30575 1726867676.43809: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867676.43819: Calling all_plugins_play to load vars for managed_node3 30575 1726867676.43821: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867676.43824: Calling groups_plugins_play to load vars for managed_node3 30575 1726867676.44855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867676.45703: done with get_vars() 30575 1726867676.45719: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:27:56 -0400 (0:00:00.496) 0:01:51.835 ****** 30575 1726867676.45782: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867676.46019: worker is 1 (out of 1 available) 30575 1726867676.46033: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867676.46047: done queuing things up, now waiting for results queue to drain 30575 1726867676.46048: waiting for pending results... 
30575 1726867676.46239: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867676.46319: in run() - task 0affcac9-a3a5-e081-a588-000000002339 30575 1726867676.46334: variable 'ansible_search_path' from source: unknown 30575 1726867676.46338: variable 'ansible_search_path' from source: unknown 30575 1726867676.46366: calling self._execute() 30575 1726867676.46448: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867676.46452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867676.46463: variable 'omit' from source: magic vars 30575 1726867676.46748: variable 'ansible_distribution_major_version' from source: facts 30575 1726867676.46757: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867676.46842: variable 'network_state' from source: role '' defaults 30575 1726867676.46851: Evaluated conditional (network_state != {}): False 30575 1726867676.46854: when evaluation is False, skipping this task 30575 1726867676.46856: _execute() done 30575 1726867676.46859: dumping result to json 30575 1726867676.46864: done dumping result, returning 30575 1726867676.46871: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-e081-a588-000000002339] 30575 1726867676.46876: sending task result for task 0affcac9-a3a5-e081-a588-000000002339 30575 1726867676.46959: done sending task result for task 0affcac9-a3a5-e081-a588-000000002339 30575 1726867676.46962: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867676.47013: no more pending results, returning what we have 30575 1726867676.47018: results queue empty 30575 1726867676.47018: checking for any_errors_fatal 30575 1726867676.47029: done checking for any_errors_fatal 
30575 1726867676.47029: checking for max_fail_percentage 30575 1726867676.47031: done checking for max_fail_percentage 30575 1726867676.47032: checking to see if all hosts have failed and the running result is not ok 30575 1726867676.47033: done checking to see if all hosts have failed 30575 1726867676.47034: getting the remaining hosts for this loop 30575 1726867676.47035: done getting the remaining hosts for this loop 30575 1726867676.47039: getting the next task for host managed_node3 30575 1726867676.47049: done getting next task for host managed_node3 30575 1726867676.47052: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867676.47057: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867676.47080: getting variables 30575 1726867676.47082: in VariableManager get_vars() 30575 1726867676.47119: Calling all_inventory to load vars for managed_node3 30575 1726867676.47121: Calling groups_inventory to load vars for managed_node3 30575 1726867676.47123: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867676.47132: Calling all_plugins_play to load vars for managed_node3 30575 1726867676.47135: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867676.47137: Calling groups_plugins_play to load vars for managed_node3 30575 1726867676.47894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867676.48750: done with get_vars() 30575 1726867676.48766: done getting variables 30575 1726867676.48811: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:27:56 -0400 (0:00:00.030) 0:01:51.865 ****** 30575 1726867676.48836: entering _queue_task() for managed_node3/debug 30575 1726867676.49071: worker is 1 (out of 1 available) 30575 1726867676.49087: exiting _queue_task() for managed_node3/debug 30575 1726867676.49099: done queuing things up, now waiting for results queue to drain 30575 1726867676.49101: waiting for pending results... 
30575 1726867676.49297: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867676.49379: in run() - task 0affcac9-a3a5-e081-a588-00000000233a 30575 1726867676.49393: variable 'ansible_search_path' from source: unknown 30575 1726867676.49396: variable 'ansible_search_path' from source: unknown 30575 1726867676.49428: calling self._execute() 30575 1726867676.49510: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867676.49514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867676.49525: variable 'omit' from source: magic vars 30575 1726867676.49814: variable 'ansible_distribution_major_version' from source: facts 30575 1726867676.49825: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867676.49831: variable 'omit' from source: magic vars 30575 1726867676.49884: variable 'omit' from source: magic vars 30575 1726867676.49908: variable 'omit' from source: magic vars 30575 1726867676.49943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867676.49969: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867676.49988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867676.50002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867676.50012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867676.50038: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867676.50041: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867676.50043: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30575 1726867676.50115: Set connection var ansible_pipelining to False 30575 1726867676.50118: Set connection var ansible_shell_type to sh 30575 1726867676.50126: Set connection var ansible_shell_executable to /bin/sh 30575 1726867676.50131: Set connection var ansible_timeout to 10 30575 1726867676.50136: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867676.50142: Set connection var ansible_connection to ssh 30575 1726867676.50160: variable 'ansible_shell_executable' from source: unknown 30575 1726867676.50163: variable 'ansible_connection' from source: unknown 30575 1726867676.50165: variable 'ansible_module_compression' from source: unknown 30575 1726867676.50168: variable 'ansible_shell_type' from source: unknown 30575 1726867676.50170: variable 'ansible_shell_executable' from source: unknown 30575 1726867676.50172: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867676.50176: variable 'ansible_pipelining' from source: unknown 30575 1726867676.50179: variable 'ansible_timeout' from source: unknown 30575 1726867676.50184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867676.50285: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867676.50295: variable 'omit' from source: magic vars 30575 1726867676.50298: starting attempt loop 30575 1726867676.50301: running the handler 30575 1726867676.50399: variable '__network_connections_result' from source: set_fact 30575 1726867676.50443: handler run complete 30575 1726867676.50455: attempt loop complete, returning result 30575 1726867676.50459: _execute() done 30575 1726867676.50462: dumping result to json 30575 1726867676.50465: 
done dumping result, returning 30575 1726867676.50473: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-e081-a588-00000000233a] 30575 1726867676.50479: sending task result for task 0affcac9-a3a5-e081-a588-00000000233a 30575 1726867676.50561: done sending task result for task 0affcac9-a3a5-e081-a588-00000000233a 30575 1726867676.50563: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 0739a9ca-1102-4bed-b35d-0eb6b0f005e6 skipped because already active" ] } 30575 1726867676.50635: no more pending results, returning what we have 30575 1726867676.50639: results queue empty 30575 1726867676.50639: checking for any_errors_fatal 30575 1726867676.50646: done checking for any_errors_fatal 30575 1726867676.50647: checking for max_fail_percentage 30575 1726867676.50648: done checking for max_fail_percentage 30575 1726867676.50649: checking to see if all hosts have failed and the running result is not ok 30575 1726867676.50649: done checking to see if all hosts have failed 30575 1726867676.50650: getting the remaining hosts for this loop 30575 1726867676.50651: done getting the remaining hosts for this loop 30575 1726867676.50654: getting the next task for host managed_node3 30575 1726867676.50662: done getting next task for host managed_node3 30575 1726867676.50666: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867676.50670: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867676.50688: getting variables 30575 1726867676.50690: in VariableManager get_vars() 30575 1726867676.50727: Calling all_inventory to load vars for managed_node3 30575 1726867676.50729: Calling groups_inventory to load vars for managed_node3 30575 1726867676.50732: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867676.50741: Calling all_plugins_play to load vars for managed_node3 30575 1726867676.50743: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867676.50746: Calling groups_plugins_play to load vars for managed_node3 30575 1726867676.51884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867676.53155: done with get_vars() 30575 1726867676.53171: done getting variables 30575 1726867676.53213: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:27:56 -0400 (0:00:00.044) 0:01:51.909 ****** 30575 1726867676.53241: entering _queue_task() for managed_node3/debug 30575 1726867676.53463: worker is 1 (out of 1 available) 30575 1726867676.53480: exiting _queue_task() for managed_node3/debug 30575 1726867676.53493: done queuing things up, now waiting for results queue to drain 30575 1726867676.53495: waiting for pending results... 30575 1726867676.53671: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867676.53762: in run() - task 0affcac9-a3a5-e081-a588-00000000233b 30575 1726867676.53773: variable 'ansible_search_path' from source: unknown 30575 1726867676.53776: variable 'ansible_search_path' from source: unknown 30575 1726867676.53804: calling self._execute() 30575 1726867676.53882: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867676.53886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867676.53897: variable 'omit' from source: magic vars 30575 1726867676.54166: variable 'ansible_distribution_major_version' from source: facts 30575 1726867676.54174: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867676.54181: variable 'omit' from source: magic vars 30575 1726867676.54228: variable 'omit' from source: magic vars 30575 1726867676.54250: variable 'omit' from source: magic vars 30575 1726867676.54284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867676.54309: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867676.54327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867676.54341: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867676.54352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867676.54375: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867676.54386: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867676.54389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867676.54455: Set connection var ansible_pipelining to False 30575 1726867676.54458: Set connection var ansible_shell_type to sh 30575 1726867676.54463: Set connection var ansible_shell_executable to /bin/sh 30575 1726867676.54469: Set connection var ansible_timeout to 10 30575 1726867676.54474: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867676.54487: Set connection var ansible_connection to ssh 30575 1726867676.54502: variable 'ansible_shell_executable' from source: unknown 30575 1726867676.54505: variable 'ansible_connection' from source: unknown 30575 1726867676.54508: variable 'ansible_module_compression' from source: unknown 30575 1726867676.54510: variable 'ansible_shell_type' from source: unknown 30575 1726867676.54512: variable 'ansible_shell_executable' from source: unknown 30575 1726867676.54514: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867676.54520: variable 'ansible_pipelining' from source: unknown 30575 1726867676.54524: variable 'ansible_timeout' from source: unknown 30575 1726867676.54526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867676.54629: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867676.54638: variable 'omit' from source: magic vars 30575 1726867676.54643: starting attempt loop 30575 1726867676.54646: running the handler 30575 1726867676.54685: variable '__network_connections_result' from source: set_fact 30575 1726867676.54743: variable '__network_connections_result' from source: set_fact 30575 1726867676.54821: handler run complete 30575 1726867676.54839: attempt loop complete, returning result 30575 1726867676.54842: _execute() done 30575 1726867676.54844: dumping result to json 30575 1726867676.54847: done dumping result, returning 30575 1726867676.54855: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-e081-a588-00000000233b] 30575 1726867676.54859: sending task result for task 0affcac9-a3a5-e081-a588-00000000233b 30575 1726867676.54944: done sending task result for task 0affcac9-a3a5-e081-a588-00000000233b 30575 1726867676.54947: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 0739a9ca-1102-4bed-b35d-0eb6b0f005e6 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 0739a9ca-1102-4bed-b35d-0eb6b0f005e6 skipped because already active" ] } } 30575 1726867676.55034: no more pending results, returning what we have 30575 1726867676.55037: results queue empty 30575 
1726867676.55038: checking for any_errors_fatal 30575 1726867676.55044: done checking for any_errors_fatal 30575 1726867676.55044: checking for max_fail_percentage 30575 1726867676.55046: done checking for max_fail_percentage 30575 1726867676.55046: checking to see if all hosts have failed and the running result is not ok 30575 1726867676.55047: done checking to see if all hosts have failed 30575 1726867676.55048: getting the remaining hosts for this loop 30575 1726867676.55049: done getting the remaining hosts for this loop 30575 1726867676.55052: getting the next task for host managed_node3 30575 1726867676.55061: done getting next task for host managed_node3 30575 1726867676.55064: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867676.55068: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867676.55081: getting variables 30575 1726867676.55082: in VariableManager get_vars() 30575 1726867676.55117: Calling all_inventory to load vars for managed_node3 30575 1726867676.55120: Calling groups_inventory to load vars for managed_node3 30575 1726867676.55126: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867676.55134: Calling all_plugins_play to load vars for managed_node3 30575 1726867676.55136: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867676.55138: Calling groups_plugins_play to load vars for managed_node3 30575 1726867676.55884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867676.56729: done with get_vars() 30575 1726867676.56745: done getting variables 30575 1726867676.56787: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:27:56 -0400 (0:00:00.035) 0:01:51.945 ****** 30575 1726867676.56810: entering _queue_task() for managed_node3/debug 30575 1726867676.57015: worker is 1 (out of 1 available) 30575 1726867676.57030: exiting _queue_task() for managed_node3/debug 30575 1726867676.57043: done queuing things up, now waiting for results queue to drain 30575 1726867676.57045: waiting for pending results... 
30575 1726867676.57228: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867676.57326: in run() - task 0affcac9-a3a5-e081-a588-00000000233c 30575 1726867676.57338: variable 'ansible_search_path' from source: unknown 30575 1726867676.57341: variable 'ansible_search_path' from source: unknown 30575 1726867676.57366: calling self._execute() 30575 1726867676.57445: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867676.57449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867676.57458: variable 'omit' from source: magic vars 30575 1726867676.57729: variable 'ansible_distribution_major_version' from source: facts 30575 1726867676.57737: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867676.57820: variable 'network_state' from source: role '' defaults 30575 1726867676.57830: Evaluated conditional (network_state != {}): False 30575 1726867676.57832: when evaluation is False, skipping this task 30575 1726867676.57835: _execute() done 30575 1726867676.57837: dumping result to json 30575 1726867676.57842: done dumping result, returning 30575 1726867676.57850: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-e081-a588-00000000233c] 30575 1726867676.57854: sending task result for task 0affcac9-a3a5-e081-a588-00000000233c 30575 1726867676.57938: done sending task result for task 0affcac9-a3a5-e081-a588-00000000233c 30575 1726867676.57941: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30575 1726867676.57991: no more pending results, returning what we have 30575 1726867676.57994: results queue empty 30575 1726867676.57995: checking for any_errors_fatal 30575 1726867676.58000: done checking for any_errors_fatal 30575 1726867676.58001: checking for 
max_fail_percentage 30575 1726867676.58003: done checking for max_fail_percentage 30575 1726867676.58004: checking to see if all hosts have failed and the running result is not ok 30575 1726867676.58005: done checking to see if all hosts have failed 30575 1726867676.58005: getting the remaining hosts for this loop 30575 1726867676.58007: done getting the remaining hosts for this loop 30575 1726867676.58010: getting the next task for host managed_node3 30575 1726867676.58017: done getting next task for host managed_node3 30575 1726867676.58021: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867676.58025: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867676.58044: getting variables 30575 1726867676.58045: in VariableManager get_vars() 30575 1726867676.58084: Calling all_inventory to load vars for managed_node3 30575 1726867676.58087: Calling groups_inventory to load vars for managed_node3 30575 1726867676.58088: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867676.58096: Calling all_plugins_play to load vars for managed_node3 30575 1726867676.58099: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867676.58101: Calling groups_plugins_play to load vars for managed_node3 30575 1726867676.58949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867676.59799: done with get_vars() 30575 1726867676.59816: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:27:56 -0400 (0:00:00.030) 0:01:51.976 ****** 30575 1726867676.59882: entering _queue_task() for managed_node3/ping 30575 1726867676.60098: worker is 1 (out of 1 available) 30575 1726867676.60112: exiting _queue_task() for managed_node3/ping 30575 1726867676.60125: done queuing things up, now waiting for results queue to drain 30575 1726867676.60127: waiting for pending results... 
30575 1726867676.60314: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867676.60409: in run() - task 0affcac9-a3a5-e081-a588-00000000233d 30575 1726867676.60423: variable 'ansible_search_path' from source: unknown 30575 1726867676.60427: variable 'ansible_search_path' from source: unknown 30575 1726867676.60453: calling self._execute() 30575 1726867676.60530: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867676.60535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867676.60544: variable 'omit' from source: magic vars 30575 1726867676.60810: variable 'ansible_distribution_major_version' from source: facts 30575 1726867676.60818: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867676.60828: variable 'omit' from source: magic vars 30575 1726867676.60874: variable 'omit' from source: magic vars 30575 1726867676.61004: variable 'omit' from source: magic vars 30575 1726867676.61007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867676.61010: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867676.61013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867676.61015: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867676.61017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867676.61019: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867676.61021: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867676.61023: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867676.61091: Set connection var ansible_pipelining to False 30575 1726867676.61094: Set connection var ansible_shell_type to sh 30575 1726867676.61098: Set connection var ansible_shell_executable to /bin/sh 30575 1726867676.61105: Set connection var ansible_timeout to 10 30575 1726867676.61107: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867676.61115: Set connection var ansible_connection to ssh 30575 1726867676.61139: variable 'ansible_shell_executable' from source: unknown 30575 1726867676.61142: variable 'ansible_connection' from source: unknown 30575 1726867676.61145: variable 'ansible_module_compression' from source: unknown 30575 1726867676.61147: variable 'ansible_shell_type' from source: unknown 30575 1726867676.61149: variable 'ansible_shell_executable' from source: unknown 30575 1726867676.61151: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867676.61153: variable 'ansible_pipelining' from source: unknown 30575 1726867676.61155: variable 'ansible_timeout' from source: unknown 30575 1726867676.61157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867676.61304: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867676.61313: variable 'omit' from source: magic vars 30575 1726867676.61318: starting attempt loop 30575 1726867676.61325: running the handler 30575 1726867676.61336: _low_level_execute_command(): starting 30575 1726867676.61343: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867676.61852: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 
1726867676.61856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.61858: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867676.61860: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867676.61862: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.61916: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867676.61919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867676.61922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867676.61983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867676.63656: stdout chunk (state=3): >>>/root <<< 30575 1726867676.63751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867676.63781: stderr chunk (state=3): >>><<< 30575 1726867676.63784: stdout chunk (state=3): >>><<< 30575 1726867676.63804: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867676.63816: _low_level_execute_command(): starting 30575 1726867676.63823: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867676.6380267-35611-29268954813269 `" && echo ansible-tmp-1726867676.6380267-35611-29268954813269="` echo /root/.ansible/tmp/ansible-tmp-1726867676.6380267-35611-29268954813269 `" ) && sleep 0' 30575 1726867676.64247: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867676.64250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.64260: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867676.64262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.64309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867676.64315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867676.64362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867676.66245: stdout chunk (state=3): >>>ansible-tmp-1726867676.6380267-35611-29268954813269=/root/.ansible/tmp/ansible-tmp-1726867676.6380267-35611-29268954813269 <<< 30575 1726867676.66353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867676.66381: stderr chunk (state=3): >>><<< 30575 1726867676.66384: stdout chunk (state=3): >>><<< 30575 1726867676.66400: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867676.6380267-35611-29268954813269=/root/.ansible/tmp/ansible-tmp-1726867676.6380267-35611-29268954813269 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867676.66438: variable 'ansible_module_compression' from source: unknown 30575 1726867676.66472: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30575 1726867676.66502: variable 'ansible_facts' from source: unknown 30575 1726867676.66559: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867676.6380267-35611-29268954813269/AnsiballZ_ping.py 30575 1726867676.66656: Sending initial data 30575 1726867676.66661: Sent initial data (152 bytes) 30575 1726867676.67105: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867676.67108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867676.67110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.67113: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867676.67115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.67161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867676.67168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867676.67213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867676.68733: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30575 1726867676.68736: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867676.68771: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867676.68815: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp8kxbw30f /root/.ansible/tmp/ansible-tmp-1726867676.6380267-35611-29268954813269/AnsiballZ_ping.py <<< 30575 1726867676.68822: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867676.6380267-35611-29268954813269/AnsiballZ_ping.py" <<< 30575 1726867676.68857: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp8kxbw30f" to remote "/root/.ansible/tmp/ansible-tmp-1726867676.6380267-35611-29268954813269/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867676.6380267-35611-29268954813269/AnsiballZ_ping.py" <<< 30575 1726867676.69361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867676.69399: stderr chunk (state=3): >>><<< 30575 1726867676.69402: stdout chunk (state=3): >>><<< 30575 1726867676.69425: done transferring module to remote 30575 1726867676.69432: _low_level_execute_command(): starting 30575 1726867676.69435: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867676.6380267-35611-29268954813269/ /root/.ansible/tmp/ansible-tmp-1726867676.6380267-35611-29268954813269/AnsiballZ_ping.py && sleep 0' 30575 1726867676.69862: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867676.69866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867676.69868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.69870: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867676.69875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.69922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867676.69925: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867676.69975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867676.71682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867676.71708: stderr chunk (state=3): >>><<< 30575 1726867676.71711: stdout chunk (state=3): >>><<< 30575 1726867676.71726: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867676.71729: _low_level_execute_command(): starting 30575 1726867676.71731: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867676.6380267-35611-29268954813269/AnsiballZ_ping.py && sleep 0' 30575 1726867676.72160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867676.72163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867676.72165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867676.72167: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867676.72169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.72217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' 
<<< 30575 1726867676.72221: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867676.72276: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867676.87287: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30575 1726867676.88649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867676.88676: stderr chunk (state=3): >>><<< 30575 1726867676.88682: stdout chunk (state=3): >>><<< 30575 1726867676.88695: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867676.88720: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867676.6380267-35611-29268954813269/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867676.88730: _low_level_execute_command(): starting 30575 1726867676.88733: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867676.6380267-35611-29268954813269/ > /dev/null 2>&1 && sleep 0' 30575 1726867676.89168: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867676.89171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867676.89204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.89207: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867676.89209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 
1726867676.89212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867676.89268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867676.89271: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867676.89285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867676.89325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867676.91209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867676.91234: stderr chunk (state=3): >>><<< 30575 1726867676.91237: stdout chunk (state=3): >>><<< 30575 1726867676.91252: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867676.91258: handler run complete 30575 1726867676.91272: attempt loop complete, returning result 30575 1726867676.91275: _execute() done 30575 1726867676.91279: dumping result to json 30575 1726867676.91281: done dumping result, returning 30575 1726867676.91291: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-e081-a588-00000000233d] 30575 1726867676.91295: sending task result for task 0affcac9-a3a5-e081-a588-00000000233d 30575 1726867676.91387: done sending task result for task 0affcac9-a3a5-e081-a588-00000000233d 30575 1726867676.91391: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30575 1726867676.91491: no more pending results, returning what we have 30575 1726867676.91494: results queue empty 30575 1726867676.91495: checking for any_errors_fatal 30575 1726867676.91501: done checking for any_errors_fatal 30575 1726867676.91502: checking for max_fail_percentage 30575 1726867676.91503: done checking for max_fail_percentage 30575 1726867676.91504: checking to see if all hosts have failed and the running result is not ok 30575 1726867676.91505: done checking to see if all hosts have failed 30575 1726867676.91506: getting the remaining hosts for this loop 30575 1726867676.91507: done getting the remaining hosts for this loop 30575 1726867676.91511: getting the next task for host managed_node3 30575 1726867676.91522: done getting next task for host managed_node3 30575 1726867676.91524: ^ task is: TASK: meta (role_complete) 30575 1726867676.91530: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867676.91543: getting variables 30575 1726867676.91544: in VariableManager get_vars() 30575 1726867676.91593: Calling all_inventory to load vars for managed_node3 30575 1726867676.91595: Calling groups_inventory to load vars for managed_node3 30575 1726867676.91597: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867676.91607: Calling all_plugins_play to load vars for managed_node3 30575 1726867676.91609: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867676.91611: Calling groups_plugins_play to load vars for managed_node3 30575 1726867676.92596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867676.97623: done with get_vars() 30575 1726867676.97640: done getting variables 30575 1726867676.97690: done queuing things up, now waiting for results queue to drain 30575 1726867676.97692: results queue empty 30575 1726867676.97692: checking for any_errors_fatal 30575 1726867676.97694: done checking for 
any_errors_fatal 30575 1726867676.97694: checking for max_fail_percentage 30575 1726867676.97695: done checking for max_fail_percentage 30575 1726867676.97695: checking to see if all hosts have failed and the running result is not ok 30575 1726867676.97696: done checking to see if all hosts have failed 30575 1726867676.97696: getting the remaining hosts for this loop 30575 1726867676.97697: done getting the remaining hosts for this loop 30575 1726867676.97699: getting the next task for host managed_node3 30575 1726867676.97703: done getting next task for host managed_node3 30575 1726867676.97705: ^ task is: TASK: Include network role 30575 1726867676.97707: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867676.97709: getting variables 30575 1726867676.97709: in VariableManager get_vars() 30575 1726867676.97719: Calling all_inventory to load vars for managed_node3 30575 1726867676.97721: Calling groups_inventory to load vars for managed_node3 30575 1726867676.97722: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867676.97726: Calling all_plugins_play to load vars for managed_node3 30575 1726867676.97727: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867676.97729: Calling groups_plugins_play to load vars for managed_node3 30575 1726867676.98342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867676.99174: done with get_vars() 30575 1726867676.99189: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Friday 20 September 2024 17:27:56 -0400 (0:00:00.393) 0:01:52.369 ****** 30575 1726867676.99241: entering _queue_task() for managed_node3/include_role 30575 1726867676.99520: worker is 1 (out of 1 available) 30575 1726867676.99533: exiting _queue_task() for managed_node3/include_role 30575 1726867676.99546: done queuing things up, now waiting for results queue to drain 30575 1726867676.99548: waiting for pending results... 
30575 1726867676.99742: running TaskExecutor() for managed_node3/TASK: Include network role 30575 1726867676.99835: in run() - task 0affcac9-a3a5-e081-a588-000000002142 30575 1726867676.99847: variable 'ansible_search_path' from source: unknown 30575 1726867676.99850: variable 'ansible_search_path' from source: unknown 30575 1726867676.99883: calling self._execute() 30575 1726867676.99960: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867676.99965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867676.99974: variable 'omit' from source: magic vars 30575 1726867677.00253: variable 'ansible_distribution_major_version' from source: facts 30575 1726867677.00262: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867677.00269: _execute() done 30575 1726867677.00272: dumping result to json 30575 1726867677.00275: done dumping result, returning 30575 1726867677.00286: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcac9-a3a5-e081-a588-000000002142] 30575 1726867677.00291: sending task result for task 0affcac9-a3a5-e081-a588-000000002142 30575 1726867677.00421: no more pending results, returning what we have 30575 1726867677.00427: in VariableManager get_vars() 30575 1726867677.00480: Calling all_inventory to load vars for managed_node3 30575 1726867677.00482: Calling groups_inventory to load vars for managed_node3 30575 1726867677.00486: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867677.00498: Calling all_plugins_play to load vars for managed_node3 30575 1726867677.00501: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867677.00504: Calling groups_plugins_play to load vars for managed_node3 30575 1726867677.01419: done sending task result for task 0affcac9-a3a5-e081-a588-000000002142 30575 1726867677.01423: WORKER PROCESS EXITING 30575 1726867677.01440: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867677.02295: done with get_vars() 30575 1726867677.02308: variable 'ansible_search_path' from source: unknown 30575 1726867677.02309: variable 'ansible_search_path' from source: unknown 30575 1726867677.02403: variable 'omit' from source: magic vars 30575 1726867677.02431: variable 'omit' from source: magic vars 30575 1726867677.02441: variable 'omit' from source: magic vars 30575 1726867677.02443: we have included files to process 30575 1726867677.02444: generating all_blocks data 30575 1726867677.02446: done generating all_blocks data 30575 1726867677.02449: processing included file: fedora.linux_system_roles.network 30575 1726867677.02462: in VariableManager get_vars() 30575 1726867677.02472: done with get_vars() 30575 1726867677.02492: in VariableManager get_vars() 30575 1726867677.02503: done with get_vars() 30575 1726867677.02531: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30575 1726867677.02600: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30575 1726867677.02649: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30575 1726867677.02911: in VariableManager get_vars() 30575 1726867677.02926: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867677.04130: iterating over new_blocks loaded from include file 30575 1726867677.04132: in VariableManager get_vars() 30575 1726867677.04143: done with get_vars() 30575 1726867677.04144: filtering new block on tags 30575 1726867677.04300: done filtering new block on tags 30575 1726867677.04302: in VariableManager get_vars() 30575 1726867677.04312: done with get_vars() 30575 1726867677.04313: filtering new block on tags 30575 1726867677.04325: done 
filtering new block on tags 30575 1726867677.04327: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30575 1726867677.04330: extending task lists for all hosts with included blocks 30575 1726867677.04393: done extending task lists 30575 1726867677.04394: done processing included files 30575 1726867677.04394: results queue empty 30575 1726867677.04395: checking for any_errors_fatal 30575 1726867677.04396: done checking for any_errors_fatal 30575 1726867677.04396: checking for max_fail_percentage 30575 1726867677.04397: done checking for max_fail_percentage 30575 1726867677.04397: checking to see if all hosts have failed and the running result is not ok 30575 1726867677.04398: done checking to see if all hosts have failed 30575 1726867677.04398: getting the remaining hosts for this loop 30575 1726867677.04399: done getting the remaining hosts for this loop 30575 1726867677.04401: getting the next task for host managed_node3 30575 1726867677.04404: done getting next task for host managed_node3 30575 1726867677.04405: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867677.04407: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867677.04414: getting variables 30575 1726867677.04415: in VariableManager get_vars() 30575 1726867677.04426: Calling all_inventory to load vars for managed_node3 30575 1726867677.04427: Calling groups_inventory to load vars for managed_node3 30575 1726867677.04429: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867677.04432: Calling all_plugins_play to load vars for managed_node3 30575 1726867677.04433: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867677.04435: Calling groups_plugins_play to load vars for managed_node3 30575 1726867677.05050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867677.05964: done with get_vars() 30575 1726867677.05979: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:27:57 -0400 (0:00:00.067) 0:01:52.437 ****** 30575 1726867677.06029: entering _queue_task() for managed_node3/include_tasks 30575 1726867677.06275: worker is 1 (out of 1 available) 30575 1726867677.06289: exiting _queue_task() for managed_node3/include_tasks 30575 1726867677.06303: done queuing things up, now waiting for results queue to drain 30575 1726867677.06305: waiting for pending results... 
30575 1726867677.06489: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867677.06575: in run() - task 0affcac9-a3a5-e081-a588-0000000024a4 30575 1726867677.06588: variable 'ansible_search_path' from source: unknown 30575 1726867677.06591: variable 'ansible_search_path' from source: unknown 30575 1726867677.06622: calling self._execute() 30575 1726867677.06700: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867677.06705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867677.06712: variable 'omit' from source: magic vars 30575 1726867677.06988: variable 'ansible_distribution_major_version' from source: facts 30575 1726867677.06997: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867677.07003: _execute() done 30575 1726867677.07007: dumping result to json 30575 1726867677.07010: done dumping result, returning 30575 1726867677.07020: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-e081-a588-0000000024a4] 30575 1726867677.07023: sending task result for task 0affcac9-a3a5-e081-a588-0000000024a4 30575 1726867677.07106: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024a4 30575 1726867677.07109: WORKER PROCESS EXITING 30575 1726867677.07164: no more pending results, returning what we have 30575 1726867677.07170: in VariableManager get_vars() 30575 1726867677.07224: Calling all_inventory to load vars for managed_node3 30575 1726867677.07227: Calling groups_inventory to load vars for managed_node3 30575 1726867677.07229: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867677.07239: Calling all_plugins_play to load vars for managed_node3 30575 1726867677.07241: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867677.07243: Calling 
groups_plugins_play to load vars for managed_node3 30575 1726867677.07993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867677.08852: done with get_vars() 30575 1726867677.08866: variable 'ansible_search_path' from source: unknown 30575 1726867677.08867: variable 'ansible_search_path' from source: unknown 30575 1726867677.08892: we have included files to process 30575 1726867677.08892: generating all_blocks data 30575 1726867677.08894: done generating all_blocks data 30575 1726867677.08896: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867677.08897: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867677.08899: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867677.09257: done processing included file 30575 1726867677.09258: iterating over new_blocks loaded from include file 30575 1726867677.09259: in VariableManager get_vars() 30575 1726867677.09275: done with get_vars() 30575 1726867677.09278: filtering new block on tags 30575 1726867677.09296: done filtering new block on tags 30575 1726867677.09298: in VariableManager get_vars() 30575 1726867677.09311: done with get_vars() 30575 1726867677.09312: filtering new block on tags 30575 1726867677.09340: done filtering new block on tags 30575 1726867677.09342: in VariableManager get_vars() 30575 1726867677.09356: done with get_vars() 30575 1726867677.09357: filtering new block on tags 30575 1726867677.09381: done filtering new block on tags 30575 1726867677.09383: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30575 1726867677.09386: extending task lists for 
all hosts with included blocks 30575 1726867677.10320: done extending task lists 30575 1726867677.10321: done processing included files 30575 1726867677.10322: results queue empty 30575 1726867677.10322: checking for any_errors_fatal 30575 1726867677.10324: done checking for any_errors_fatal 30575 1726867677.10324: checking for max_fail_percentage 30575 1726867677.10325: done checking for max_fail_percentage 30575 1726867677.10326: checking to see if all hosts have failed and the running result is not ok 30575 1726867677.10326: done checking to see if all hosts have failed 30575 1726867677.10327: getting the remaining hosts for this loop 30575 1726867677.10327: done getting the remaining hosts for this loop 30575 1726867677.10329: getting the next task for host managed_node3 30575 1726867677.10332: done getting next task for host managed_node3 30575 1726867677.10334: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867677.10337: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867677.10344: getting variables 30575 1726867677.10345: in VariableManager get_vars() 30575 1726867677.10354: Calling all_inventory to load vars for managed_node3 30575 1726867677.10355: Calling groups_inventory to load vars for managed_node3 30575 1726867677.10357: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867677.10360: Calling all_plugins_play to load vars for managed_node3 30575 1726867677.10361: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867677.10363: Calling groups_plugins_play to load vars for managed_node3 30575 1726867677.11007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867677.11846: done with get_vars() 30575 1726867677.11860: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:27:57 -0400 (0:00:00.058) 0:01:52.496 ****** 30575 1726867677.11909: entering _queue_task() for managed_node3/setup 30575 1726867677.12140: worker is 1 (out of 1 available) 30575 1726867677.12152: exiting _queue_task() for managed_node3/setup 30575 1726867677.12165: done queuing things up, now waiting for results queue to drain 30575 1726867677.12167: waiting for pending results... 
30575 1726867677.12344: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867677.12435: in run() - task 0affcac9-a3a5-e081-a588-0000000024fb 30575 1726867677.12448: variable 'ansible_search_path' from source: unknown 30575 1726867677.12452: variable 'ansible_search_path' from source: unknown 30575 1726867677.12481: calling self._execute() 30575 1726867677.12557: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867677.12563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867677.12571: variable 'omit' from source: magic vars 30575 1726867677.12839: variable 'ansible_distribution_major_version' from source: facts 30575 1726867677.12848: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867677.12987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867677.14419: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867677.14459: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867677.14493: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867677.14519: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867677.14540: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867677.14601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867677.14624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867677.14641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867677.14667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867677.14681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867677.14716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867677.14735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867677.14752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867677.14776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867677.14791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867677.14896: variable '__network_required_facts' from source: role 
'' defaults 30575 1726867677.14904: variable 'ansible_facts' from source: unknown 30575 1726867677.15357: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30575 1726867677.15361: when evaluation is False, skipping this task 30575 1726867677.15363: _execute() done 30575 1726867677.15366: dumping result to json 30575 1726867677.15368: done dumping result, returning 30575 1726867677.15376: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-e081-a588-0000000024fb] 30575 1726867677.15382: sending task result for task 0affcac9-a3a5-e081-a588-0000000024fb 30575 1726867677.15458: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024fb 30575 1726867677.15460: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867677.15504: no more pending results, returning what we have 30575 1726867677.15508: results queue empty 30575 1726867677.15508: checking for any_errors_fatal 30575 1726867677.15509: done checking for any_errors_fatal 30575 1726867677.15510: checking for max_fail_percentage 30575 1726867677.15512: done checking for max_fail_percentage 30575 1726867677.15513: checking to see if all hosts have failed and the running result is not ok 30575 1726867677.15513: done checking to see if all hosts have failed 30575 1726867677.15514: getting the remaining hosts for this loop 30575 1726867677.15516: done getting the remaining hosts for this loop 30575 1726867677.15519: getting the next task for host managed_node3 30575 1726867677.15530: done getting next task for host managed_node3 30575 1726867677.15534: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867677.15540: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867677.15569: getting variables 30575 1726867677.15570: in VariableManager get_vars() 30575 1726867677.15615: Calling all_inventory to load vars for managed_node3 30575 1726867677.15617: Calling groups_inventory to load vars for managed_node3 30575 1726867677.15619: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867677.15627: Calling all_plugins_play to load vars for managed_node3 30575 1726867677.15630: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867677.15639: Calling groups_plugins_play to load vars for managed_node3 30575 1726867677.16423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867677.17292: done with get_vars() 30575 1726867677.17309: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:27:57 -0400 (0:00:00.054) 0:01:52.551 ****** 30575 1726867677.17373: entering _queue_task() for managed_node3/stat 30575 1726867677.17598: worker is 1 (out of 1 available) 30575 1726867677.17613: exiting _queue_task() for managed_node3/stat 30575 1726867677.17626: done queuing things up, now waiting for results queue to drain 30575 1726867677.17628: waiting for pending results... 
30575 1726867677.17813: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867677.17912: in run() - task 0affcac9-a3a5-e081-a588-0000000024fd 30575 1726867677.17926: variable 'ansible_search_path' from source: unknown 30575 1726867677.17930: variable 'ansible_search_path' from source: unknown 30575 1726867677.17962: calling self._execute() 30575 1726867677.18041: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867677.18044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867677.18052: variable 'omit' from source: magic vars 30575 1726867677.18332: variable 'ansible_distribution_major_version' from source: facts 30575 1726867677.18341: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867677.18457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867677.18656: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867677.18689: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867677.18712: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867677.18740: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867677.18801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867677.18819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867677.18883: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867677.18887: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867677.18927: variable '__network_is_ostree' from source: set_fact 30575 1726867677.18932: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867677.18935: when evaluation is False, skipping this task 30575 1726867677.18937: _execute() done 30575 1726867677.18947: dumping result to json 30575 1726867677.18950: done dumping result, returning 30575 1726867677.18953: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-e081-a588-0000000024fd] 30575 1726867677.18958: sending task result for task 0affcac9-a3a5-e081-a588-0000000024fd 30575 1726867677.19034: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024fd 30575 1726867677.19036: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867677.19104: no more pending results, returning what we have 30575 1726867677.19107: results queue empty 30575 1726867677.19108: checking for any_errors_fatal 30575 1726867677.19114: done checking for any_errors_fatal 30575 1726867677.19115: checking for max_fail_percentage 30575 1726867677.19116: done checking for max_fail_percentage 30575 1726867677.19117: checking to see if all hosts have failed and the running result is not ok 30575 1726867677.19118: done checking to see if all hosts have failed 30575 1726867677.19119: getting the remaining hosts for this loop 30575 1726867677.19120: done getting the remaining hosts for this loop 30575 
1726867677.19124: getting the next task for host managed_node3 30575 1726867677.19132: done getting next task for host managed_node3 30575 1726867677.19135: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867677.19140: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867677.19163: getting variables 30575 1726867677.19165: in VariableManager get_vars() 30575 1726867677.19201: Calling all_inventory to load vars for managed_node3 30575 1726867677.19204: Calling groups_inventory to load vars for managed_node3 30575 1726867677.19206: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867677.19213: Calling all_plugins_play to load vars for managed_node3 30575 1726867677.19216: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867677.19218: Calling groups_plugins_play to load vars for managed_node3 30575 1726867677.20069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867677.20917: done with get_vars() 30575 1726867677.20931: done getting variables 30575 1726867677.20971: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:27:57 -0400 (0:00:00.036) 0:01:52.587 ****** 30575 1726867677.21000: entering _queue_task() for managed_node3/set_fact 30575 1726867677.21215: worker is 1 (out of 1 available) 30575 1726867677.21227: exiting _queue_task() for managed_node3/set_fact 30575 1726867677.21242: done queuing things up, now waiting for results queue to drain 30575 1726867677.21244: waiting for pending results... 
30575 1726867677.21422: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867677.21519: in run() - task 0affcac9-a3a5-e081-a588-0000000024fe 30575 1726867677.21533: variable 'ansible_search_path' from source: unknown 30575 1726867677.21536: variable 'ansible_search_path' from source: unknown 30575 1726867677.21563: calling self._execute() 30575 1726867677.21641: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867677.21645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867677.21654: variable 'omit' from source: magic vars 30575 1726867677.21929: variable 'ansible_distribution_major_version' from source: facts 30575 1726867677.21937: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867677.22052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867677.22245: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867677.22276: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867677.22301: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867677.22327: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867677.22392: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867677.22409: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867677.22430: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867677.22449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867677.22519: variable '__network_is_ostree' from source: set_fact 30575 1726867677.22528: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867677.22531: when evaluation is False, skipping this task 30575 1726867677.22533: _execute() done 30575 1726867677.22536: dumping result to json 30575 1726867677.22539: done dumping result, returning 30575 1726867677.22546: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-e081-a588-0000000024fe] 30575 1726867677.22551: sending task result for task 0affcac9-a3a5-e081-a588-0000000024fe 30575 1726867677.22627: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024fe 30575 1726867677.22631: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867677.22704: no more pending results, returning what we have 30575 1726867677.22707: results queue empty 30575 1726867677.22708: checking for any_errors_fatal 30575 1726867677.22712: done checking for any_errors_fatal 30575 1726867677.22713: checking for max_fail_percentage 30575 1726867677.22714: done checking for max_fail_percentage 30575 1726867677.22715: checking to see if all hosts have failed and the running result is not ok 30575 1726867677.22716: done checking to see if all hosts have failed 30575 1726867677.22716: getting the remaining hosts for this loop 30575 1726867677.22717: done getting the remaining hosts for this loop 
30575 1726867677.22720: getting the next task for host managed_node3 30575 1726867677.22729: done getting next task for host managed_node3 30575 1726867677.22732: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867677.22737: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867677.22758: getting variables 30575 1726867677.22760: in VariableManager get_vars() 30575 1726867677.22796: Calling all_inventory to load vars for managed_node3 30575 1726867677.22799: Calling groups_inventory to load vars for managed_node3 30575 1726867677.22801: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867677.22808: Calling all_plugins_play to load vars for managed_node3 30575 1726867677.22811: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867677.22813: Calling groups_plugins_play to load vars for managed_node3 30575 1726867677.23549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867677.24514: done with get_vars() 30575 1726867677.24529: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:27:57 -0400 (0:00:00.035) 0:01:52.623 ****** 30575 1726867677.24593: entering _queue_task() for managed_node3/service_facts 30575 1726867677.24789: worker is 1 (out of 1 available) 30575 1726867677.24805: exiting _queue_task() for managed_node3/service_facts 30575 1726867677.24819: done queuing things up, now waiting for results queue to drain 30575 1726867677.24820: waiting for pending results... 
30575 1726867677.24992: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867677.25092: in run() - task 0affcac9-a3a5-e081-a588-000000002500 30575 1726867677.25103: variable 'ansible_search_path' from source: unknown 30575 1726867677.25106: variable 'ansible_search_path' from source: unknown 30575 1726867677.25135: calling self._execute() 30575 1726867677.25205: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867677.25208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867677.25217: variable 'omit' from source: magic vars 30575 1726867677.25481: variable 'ansible_distribution_major_version' from source: facts 30575 1726867677.25490: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867677.25496: variable 'omit' from source: magic vars 30575 1726867677.25546: variable 'omit' from source: magic vars 30575 1726867677.25569: variable 'omit' from source: magic vars 30575 1726867677.25602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867677.25630: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867677.25644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867677.25657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867677.25669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867677.25694: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867677.25696: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867677.25701: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867677.25768: Set connection var ansible_pipelining to False 30575 1726867677.25772: Set connection var ansible_shell_type to sh 30575 1726867677.25775: Set connection var ansible_shell_executable to /bin/sh 30575 1726867677.25783: Set connection var ansible_timeout to 10 30575 1726867677.25787: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867677.25794: Set connection var ansible_connection to ssh 30575 1726867677.25812: variable 'ansible_shell_executable' from source: unknown 30575 1726867677.25815: variable 'ansible_connection' from source: unknown 30575 1726867677.25818: variable 'ansible_module_compression' from source: unknown 30575 1726867677.25820: variable 'ansible_shell_type' from source: unknown 30575 1726867677.25828: variable 'ansible_shell_executable' from source: unknown 30575 1726867677.25830: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867677.25832: variable 'ansible_pipelining' from source: unknown 30575 1726867677.25834: variable 'ansible_timeout' from source: unknown 30575 1726867677.25836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867677.25970: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867677.25980: variable 'omit' from source: magic vars 30575 1726867677.25985: starting attempt loop 30575 1726867677.25987: running the handler 30575 1726867677.25998: _low_level_execute_command(): starting 30575 1726867677.26004: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867677.26516: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30575 1726867677.26522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867677.26526: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867677.26528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867677.26571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867677.26574: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867677.26576: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867677.26637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867677.28349: stdout chunk (state=3): >>>/root <<< 30575 1726867677.28455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867677.28482: stderr chunk (state=3): >>><<< 30575 1726867677.28488: stdout chunk (state=3): >>><<< 30575 1726867677.28507: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867677.28520: _low_level_execute_command(): starting 30575 1726867677.28524: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867677.2850654-35624-181749693298876 `" && echo ansible-tmp-1726867677.2850654-35624-181749693298876="` echo /root/.ansible/tmp/ansible-tmp-1726867677.2850654-35624-181749693298876 `" ) && sleep 0' 30575 1726867677.28942: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867677.28945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867677.28948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867677.28957: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867677.28960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867677.29003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867677.29006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867677.29056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867677.30956: stdout chunk (state=3): >>>ansible-tmp-1726867677.2850654-35624-181749693298876=/root/.ansible/tmp/ansible-tmp-1726867677.2850654-35624-181749693298876 <<< 30575 1726867677.31065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867677.31088: stderr chunk (state=3): >>><<< 30575 1726867677.31091: stdout chunk (state=3): >>><<< 30575 1726867677.31104: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867677.2850654-35624-181749693298876=/root/.ansible/tmp/ansible-tmp-1726867677.2850654-35624-181749693298876 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867677.31142: variable 'ansible_module_compression' from source: unknown 30575 1726867677.31174: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30575 1726867677.31206: variable 'ansible_facts' from source: unknown 30575 1726867677.31266: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867677.2850654-35624-181749693298876/AnsiballZ_service_facts.py 30575 1726867677.31358: Sending initial data 30575 1726867677.31362: Sent initial data (162 bytes) 30575 1726867677.31784: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867677.31787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867677.31790: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 
1726867677.31792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867677.31843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867677.31846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867677.31895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867677.33474: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867677.33479: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867677.33514: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867677.33556: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpykre69ee /root/.ansible/tmp/ansible-tmp-1726867677.2850654-35624-181749693298876/AnsiballZ_service_facts.py <<< 30575 1726867677.33564: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867677.2850654-35624-181749693298876/AnsiballZ_service_facts.py" <<< 30575 1726867677.33600: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpykre69ee" to remote "/root/.ansible/tmp/ansible-tmp-1726867677.2850654-35624-181749693298876/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867677.2850654-35624-181749693298876/AnsiballZ_service_facts.py" <<< 30575 1726867677.34157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867677.34195: stderr chunk (state=3): >>><<< 30575 1726867677.34199: stdout chunk (state=3): >>><<< 30575 1726867677.34259: done transferring module to remote 30575 1726867677.34268: _low_level_execute_command(): starting 30575 1726867677.34271: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867677.2850654-35624-181749693298876/ /root/.ansible/tmp/ansible-tmp-1726867677.2850654-35624-181749693298876/AnsiballZ_service_facts.py && sleep 0' 30575 1726867677.34681: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867677.34684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867677.34686: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867677.34688: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867677.34694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867677.34739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867677.34742: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867677.34792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867677.36572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867677.36593: stderr chunk (state=3): >>><<< 30575 1726867677.36596: stdout chunk (state=3): >>><<< 30575 1726867677.36608: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867677.36611: _low_level_execute_command(): starting 30575 1726867677.36615: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867677.2850654-35624-181749693298876/AnsiballZ_service_facts.py && sleep 0' 30575 1726867677.37018: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867677.37021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867677.37024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867677.37026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867677.37028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867677.37076: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867677.37086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867677.37132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867678.87801: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30575 1726867678.87834: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 30575 1726867678.87840: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": 
{"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 30575 1726867678.87845: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, 
"dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integratio<<< 30575 1726867678.87861: stdout chunk (state=3): >>>n.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": 
"sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-<<< 30575 1726867678.87884: stdout chunk (state=3): >>>boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30575 1726867678.89335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867678.89368: stderr chunk (state=3): >>><<< 30575 1726867678.89371: stdout chunk (state=3): >>><<< 30575 1726867678.89402: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": 
"dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", 
"status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": 
"rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": 
"systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
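The assembled stdout above is the full return payload of the `service_facts` module: a JSON document whose `ansible_facts.services` key maps each unit name to a dict with `name`, `state`, `status`, and `source`. A minimal sketch of post-processing that structure outside Ansible, assuming the JSON payload has been captured (the sample below is a small hypothetical excerpt of the mapping seen in the log, not the full output):

```python
import json

# Hypothetical excerpt of the service_facts payload recorded in the log.
sample = json.loads("""
{"ansible_facts": {"services": {
  "sshd.service":  {"name": "sshd.service",  "state": "running",
                    "status": "enabled", "source": "systemd"},
  "sssd.service":  {"name": "sssd.service",  "state": "stopped",
                    "status": "enabled", "source": "systemd"},
  "user@.service": {"name": "user@.service", "state": "unknown",
                    "status": "static",  "source": "systemd"}
}}}
""")

def services_by_state(facts, state):
    """Return sorted unit names whose 'state' field matches the given value."""
    services = facts["ansible_facts"]["services"]
    return sorted(name for name, svc in services.items()
                  if svc["state"] == state)

print(services_by_state(sample, "running"))  # ['sshd.service']
print(services_by_state(sample, "stopped"))  # ['sssd.service']
```

Inside a playbook the same filtering is usually done directly on the registered fact (e.g. with `selectattr('value.state', 'equalto', 'running')` over `ansible_facts.services | dict2items`); the standalone script is just a convenient way to inspect a captured payload.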
30575 1726867678.89854: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867677.2850654-35624-181749693298876/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867678.89862: _low_level_execute_command(): starting 30575 1726867678.89867: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867677.2850654-35624-181749693298876/ > /dev/null 2>&1 && sleep 0' 30575 1726867678.90318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867678.90323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867678.90325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867678.90327: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867678.90329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867678.90383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867678.90386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867678.90394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867678.90437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867678.92220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867678.92243: stderr chunk (state=3): >>><<< 30575 1726867678.92246: stdout chunk (state=3): >>><<< 30575 1726867678.92258: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867678.92265: handler run complete 30575 1726867678.92381: variable 'ansible_facts' from source: unknown 30575 1726867678.92475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867678.92756: variable 'ansible_facts' from source: unknown 30575 1726867678.92841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867678.92955: attempt loop complete, returning result 30575 1726867678.92959: _execute() done 30575 1726867678.92961: dumping result to json 30575 1726867678.92998: done dumping result, returning 30575 1726867678.93006: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-e081-a588-000000002500] 30575 1726867678.93014: sending task result for task 0affcac9-a3a5-e081-a588-000000002500 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867678.93647: no more pending results, returning what we have 30575 1726867678.93650: results queue empty 30575 1726867678.93651: checking for any_errors_fatal 30575 1726867678.93654: done checking for any_errors_fatal 30575 1726867678.93655: checking for max_fail_percentage 30575 1726867678.93656: done checking for max_fail_percentage 30575 1726867678.93657: checking to see if all hosts have failed and the running result is not ok 30575 1726867678.93658: done checking to see if all hosts have failed 30575 1726867678.93658: getting the remaining hosts for this loop 30575 1726867678.93659: done getting the remaining hosts for this loop 30575 1726867678.93663: getting the next task for host managed_node3 30575 1726867678.93670: done getting next task for host managed_node3 30575 1726867678.93673: ^ task is: 
TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867678.93683: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867678.93697: done sending task result for task 0affcac9-a3a5-e081-a588-000000002500 30575 1726867678.93700: WORKER PROCESS EXITING 30575 1726867678.93707: getting variables 30575 1726867678.93708: in VariableManager get_vars() 30575 1726867678.93736: Calling all_inventory to load vars for managed_node3 30575 1726867678.93737: Calling groups_inventory to load vars for managed_node3 30575 1726867678.93739: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867678.93745: Calling all_plugins_play to load vars for managed_node3 30575 1726867678.93747: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867678.93752: Calling groups_plugins_play to load vars for managed_node3 30575 1726867678.94568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867678.95429: done with get_vars() 30575 1726867678.95445: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:27:58 -0400 (0:00:01.709) 0:01:54.332 ****** 30575 1726867678.95515: entering _queue_task() for managed_node3/package_facts 30575 1726867678.95740: worker is 1 (out of 1 available) 30575 1726867678.95752: exiting _queue_task() for managed_node3/package_facts 30575 1726867678.95765: done queuing things up, now waiting for results queue to drain 30575 1726867678.95767: waiting for pending results... 
30575 1726867678.95960: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867678.96067: in run() - task 0affcac9-a3a5-e081-a588-000000002501 30575 1726867678.96082: variable 'ansible_search_path' from source: unknown 30575 1726867678.96086: variable 'ansible_search_path' from source: unknown 30575 1726867678.96114: calling self._execute() 30575 1726867678.96190: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867678.96194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867678.96205: variable 'omit' from source: magic vars 30575 1726867678.96483: variable 'ansible_distribution_major_version' from source: facts 30575 1726867678.96493: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867678.96499: variable 'omit' from source: magic vars 30575 1726867678.96554: variable 'omit' from source: magic vars 30575 1726867678.96579: variable 'omit' from source: magic vars 30575 1726867678.96609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867678.96637: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867678.96654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867678.96667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867678.96680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867678.96704: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867678.96707: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867678.96709: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867678.96782: Set connection var ansible_pipelining to False 30575 1726867678.96785: Set connection var ansible_shell_type to sh 30575 1726867678.96790: Set connection var ansible_shell_executable to /bin/sh 30575 1726867678.96795: Set connection var ansible_timeout to 10 30575 1726867678.96800: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867678.96806: Set connection var ansible_connection to ssh 30575 1726867678.96827: variable 'ansible_shell_executable' from source: unknown 30575 1726867678.96830: variable 'ansible_connection' from source: unknown 30575 1726867678.96833: variable 'ansible_module_compression' from source: unknown 30575 1726867678.96835: variable 'ansible_shell_type' from source: unknown 30575 1726867678.96837: variable 'ansible_shell_executable' from source: unknown 30575 1726867678.96839: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867678.96842: variable 'ansible_pipelining' from source: unknown 30575 1726867678.96846: variable 'ansible_timeout' from source: unknown 30575 1726867678.96849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867678.96991: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867678.97000: variable 'omit' from source: magic vars 30575 1726867678.97005: starting attempt loop 30575 1726867678.97008: running the handler 30575 1726867678.97020: _low_level_execute_command(): starting 30575 1726867678.97028: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867678.97536: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30575 1726867678.97540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867678.97543: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867678.97546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867678.97598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867678.97602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867678.97608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867678.97651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867678.99240: stdout chunk (state=3): >>>/root <<< 30575 1726867678.99338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867678.99363: stderr chunk (state=3): >>><<< 30575 1726867678.99366: stdout chunk (state=3): >>><<< 30575 1726867678.99389: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867678.99399: _low_level_execute_command(): starting 30575 1726867678.99405: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867678.9938781-35638-127987587688615 `" && echo ansible-tmp-1726867678.9938781-35638-127987587688615="` echo /root/.ansible/tmp/ansible-tmp-1726867678.9938781-35638-127987587688615 `" ) && sleep 0' 30575 1726867678.99824: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867678.99827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867678.99836: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867678.99838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867678.99887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867678.99891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867678.99942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867679.01811: stdout chunk (state=3): >>>ansible-tmp-1726867678.9938781-35638-127987587688615=/root/.ansible/tmp/ansible-tmp-1726867678.9938781-35638-127987587688615 <<< 30575 1726867679.01920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867679.01944: stderr chunk (state=3): >>><<< 30575 1726867679.01947: stdout chunk (state=3): >>><<< 30575 1726867679.01960: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867678.9938781-35638-127987587688615=/root/.ansible/tmp/ansible-tmp-1726867678.9938781-35638-127987587688615 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867679.02001: variable 'ansible_module_compression' from source: unknown 30575 1726867679.02042: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30575 1726867679.02098: variable 'ansible_facts' from source: unknown 30575 1726867679.02216: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867678.9938781-35638-127987587688615/AnsiballZ_package_facts.py 30575 1726867679.02319: Sending initial data 30575 1726867679.02323: Sent initial data (162 bytes) 30575 1726867679.02763: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867679.02767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867679.02769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867679.02771: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867679.02773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867679.02820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867679.02824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867679.02876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867679.04409: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867679.04415: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867679.04450: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867679.04499: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpe_i3acvf /root/.ansible/tmp/ansible-tmp-1726867678.9938781-35638-127987587688615/AnsiballZ_package_facts.py <<< 30575 1726867679.04504: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867678.9938781-35638-127987587688615/AnsiballZ_package_facts.py" <<< 30575 1726867679.04541: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpe_i3acvf" to remote "/root/.ansible/tmp/ansible-tmp-1726867678.9938781-35638-127987587688615/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867678.9938781-35638-127987587688615/AnsiballZ_package_facts.py" <<< 30575 1726867679.05571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867679.05607: stderr chunk (state=3): >>><<< 30575 1726867679.05610: stdout chunk (state=3): >>><<< 30575 1726867679.05639: done transferring module to remote 30575 1726867679.05648: _low_level_execute_command(): starting 30575 1726867679.05652: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867678.9938781-35638-127987587688615/ /root/.ansible/tmp/ansible-tmp-1726867678.9938781-35638-127987587688615/AnsiballZ_package_facts.py && sleep 0' 30575 1726867679.06074: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867679.06079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867679.06082: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867679.06084: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867679.06089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867679.06143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867679.06146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867679.06189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867679.07926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867679.07951: stderr chunk (state=3): >>><<< 30575 1726867679.07954: stdout chunk (state=3): >>><<< 30575 1726867679.07966: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867679.07969: _low_level_execute_command(): starting 30575 1726867679.07972: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867678.9938781-35638-127987587688615/AnsiballZ_package_facts.py && sleep 0' 30575 1726867679.08372: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867679.08375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867679.08379: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867679.08382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867679.08384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867679.08431: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867679.08435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867679.08488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867679.52284: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", 
"version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30575 1726867679.52332: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", 
"version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": 
"0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30575 1726867679.52354: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", 
"release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 30575 1726867679.52358: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": 
[{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30575 1726867679.52370: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": 
"gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 30575 1726867679.52384: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 30575 1726867679.52397: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": 
"256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 30575 1726867679.52400: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 30575 1726867679.52435: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": 
"9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", 
"release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30575 1726867679.52440: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": 
"2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30575 1726867679.52448: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30575 1726867679.52474: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": 
"wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30575 1726867679.52488: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": 
[{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 30575 1726867679.52501: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30575 1726867679.54225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867679.54255: stderr chunk (state=3): >>><<< 30575 1726867679.54258: stdout chunk (state=3): >>><<< 30575 1726867679.54298: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867679.56066: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867678.9938781-35638-127987587688615/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867679.56166: _low_level_execute_command(): starting 30575 1726867679.56169: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867678.9938781-35638-127987587688615/ > /dev/null 2>&1 && sleep 0' 30575 1726867679.56648: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867679.56667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867679.56681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867679.56724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867679.56742: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867679.56792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867679.58620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867679.58644: stderr chunk (state=3): >>><<< 30575 1726867679.58647: stdout chunk (state=3): >>><<< 30575 1726867679.58658: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867679.58664: handler run complete 30575 1726867679.59100: variable 'ansible_facts' 
from source: unknown 30575 1726867679.59372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867679.60402: variable 'ansible_facts' from source: unknown 30575 1726867679.60652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867679.61029: attempt loop complete, returning result 30575 1726867679.61038: _execute() done 30575 1726867679.61041: dumping result to json 30575 1726867679.61155: done dumping result, returning 30575 1726867679.61162: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-e081-a588-000000002501] 30575 1726867679.61167: sending task result for task 0affcac9-a3a5-e081-a588-000000002501 30575 1726867679.62527: done sending task result for task 0affcac9-a3a5-e081-a588-000000002501 30575 1726867679.62530: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867679.62621: no more pending results, returning what we have 30575 1726867679.62623: results queue empty 30575 1726867679.62624: checking for any_errors_fatal 30575 1726867679.62628: done checking for any_errors_fatal 30575 1726867679.62629: checking for max_fail_percentage 30575 1726867679.62632: done checking for max_fail_percentage 30575 1726867679.62633: checking to see if all hosts have failed and the running result is not ok 30575 1726867679.62634: done checking to see if all hosts have failed 30575 1726867679.62634: getting the remaining hosts for this loop 30575 1726867679.62635: done getting the remaining hosts for this loop 30575 1726867679.62637: getting the next task for host managed_node3 30575 1726867679.62643: done getting next task for host managed_node3 30575 1726867679.62645: ^ task is: TASK: fedora.linux_system_roles.network : 
Print network provider 30575 1726867679.62649: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867679.62657: getting variables 30575 1726867679.62658: in VariableManager get_vars() 30575 1726867679.62692: Calling all_inventory to load vars for managed_node3 30575 1726867679.62694: Calling groups_inventory to load vars for managed_node3 30575 1726867679.62695: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867679.62702: Calling all_plugins_play to load vars for managed_node3 30575 1726867679.62703: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867679.62705: Calling groups_plugins_play to load vars for managed_node3 30575 1726867679.63382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867679.64312: done with get_vars() 30575 1726867679.64329: done getting variables 30575 1726867679.64372: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:27:59 -0400 (0:00:00.688) 0:01:55.021 ****** 30575 1726867679.64403: entering _queue_task() for managed_node3/debug 30575 1726867679.64637: worker is 1 (out of 1 available) 30575 1726867679.64650: exiting _queue_task() for managed_node3/debug 30575 1726867679.64663: done queuing things up, now waiting for results queue to drain 30575 1726867679.64665: waiting for pending results... 
30575 1726867679.64854: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867679.64945: in run() - task 0affcac9-a3a5-e081-a588-0000000024a5 30575 1726867679.64958: variable 'ansible_search_path' from source: unknown 30575 1726867679.64961: variable 'ansible_search_path' from source: unknown 30575 1726867679.64992: calling self._execute() 30575 1726867679.65074: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867679.65080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867679.65088: variable 'omit' from source: magic vars 30575 1726867679.65389: variable 'ansible_distribution_major_version' from source: facts 30575 1726867679.65399: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867679.65406: variable 'omit' from source: magic vars 30575 1726867679.65454: variable 'omit' from source: magic vars 30575 1726867679.65527: variable 'network_provider' from source: set_fact 30575 1726867679.65542: variable 'omit' from source: magic vars 30575 1726867679.65573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867679.65601: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867679.65616: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867679.65633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867679.65642: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867679.65667: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867679.65670: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867679.65673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867679.65742: Set connection var ansible_pipelining to False 30575 1726867679.65746: Set connection var ansible_shell_type to sh 30575 1726867679.65749: Set connection var ansible_shell_executable to /bin/sh 30575 1726867679.65756: Set connection var ansible_timeout to 10 30575 1726867679.65758: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867679.65766: Set connection var ansible_connection to ssh 30575 1726867679.65786: variable 'ansible_shell_executable' from source: unknown 30575 1726867679.65789: variable 'ansible_connection' from source: unknown 30575 1726867679.65792: variable 'ansible_module_compression' from source: unknown 30575 1726867679.65794: variable 'ansible_shell_type' from source: unknown 30575 1726867679.65796: variable 'ansible_shell_executable' from source: unknown 30575 1726867679.65798: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867679.65803: variable 'ansible_pipelining' from source: unknown 30575 1726867679.65805: variable 'ansible_timeout' from source: unknown 30575 1726867679.65809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867679.65912: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867679.65923: variable 'omit' from source: magic vars 30575 1726867679.65928: starting attempt loop 30575 1726867679.65931: running the handler 30575 1726867679.65970: handler run complete 30575 1726867679.65982: attempt loop complete, returning result 30575 1726867679.65985: _execute() done 30575 1726867679.65987: dumping result to json 30575 1726867679.65990: done dumping result, returning 
30575 1726867679.66000: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-e081-a588-0000000024a5] 30575 1726867679.66003: sending task result for task 0affcac9-a3a5-e081-a588-0000000024a5 30575 1726867679.66080: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024a5 30575 1726867679.66083: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30575 1726867679.66159: no more pending results, returning what we have 30575 1726867679.66162: results queue empty 30575 1726867679.66163: checking for any_errors_fatal 30575 1726867679.66168: done checking for any_errors_fatal 30575 1726867679.66169: checking for max_fail_percentage 30575 1726867679.66170: done checking for max_fail_percentage 30575 1726867679.66171: checking to see if all hosts have failed and the running result is not ok 30575 1726867679.66172: done checking to see if all hosts have failed 30575 1726867679.66173: getting the remaining hosts for this loop 30575 1726867679.66174: done getting the remaining hosts for this loop 30575 1726867679.66179: getting the next task for host managed_node3 30575 1726867679.66187: done getting next task for host managed_node3 30575 1726867679.66190: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867679.66195: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867679.66207: getting variables 30575 1726867679.66208: in VariableManager get_vars() 30575 1726867679.66246: Calling all_inventory to load vars for managed_node3 30575 1726867679.66249: Calling groups_inventory to load vars for managed_node3 30575 1726867679.66251: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867679.66259: Calling all_plugins_play to load vars for managed_node3 30575 1726867679.66261: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867679.66264: Calling groups_plugins_play to load vars for managed_node3 30575 1726867679.66997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867679.67854: done with get_vars() 30575 1726867679.67868: done getting variables 30575 1726867679.67910: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:27:59 -0400 (0:00:00.035) 0:01:55.056 ****** 30575 1726867679.67939: entering _queue_task() for managed_node3/fail 30575 1726867679.68142: worker is 1 (out of 1 available) 30575 1726867679.68157: exiting _queue_task() for managed_node3/fail 30575 1726867679.68170: done queuing things up, now waiting for results queue to drain 30575 1726867679.68172: waiting for pending results... 30575 1726867679.68354: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867679.68450: in run() - task 0affcac9-a3a5-e081-a588-0000000024a6 30575 1726867679.68460: variable 'ansible_search_path' from source: unknown 30575 1726867679.68465: variable 'ansible_search_path' from source: unknown 30575 1726867679.68494: calling self._execute() 30575 1726867679.68567: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867679.68571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867679.68580: variable 'omit' from source: magic vars 30575 1726867679.68850: variable 'ansible_distribution_major_version' from source: facts 30575 1726867679.68859: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867679.68943: variable 'network_state' from source: role '' defaults 30575 1726867679.68952: Evaluated conditional (network_state != {}): False 30575 1726867679.68955: when evaluation is False, skipping this task 30575 1726867679.68958: _execute() done 30575 1726867679.68961: dumping result to json 30575 1726867679.68964: done dumping result, returning 30575 1726867679.68972: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-e081-a588-0000000024a6] 30575 1726867679.68976: sending task result for task 0affcac9-a3a5-e081-a588-0000000024a6 30575 1726867679.69056: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024a6 30575 1726867679.69059: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867679.69109: no more pending results, returning what we have 30575 1726867679.69113: results queue empty 30575 1726867679.69114: checking for any_errors_fatal 30575 1726867679.69120: done checking for any_errors_fatal 30575 1726867679.69121: checking for max_fail_percentage 30575 1726867679.69123: done checking for max_fail_percentage 30575 1726867679.69124: checking to see if all hosts have failed and the running result is not ok 30575 1726867679.69124: done checking to see if all hosts have failed 30575 1726867679.69125: getting the remaining hosts for this loop 30575 1726867679.69126: done getting the remaining hosts for this loop 30575 1726867679.69130: getting the next task for host managed_node3 30575 1726867679.69138: done getting next task for host managed_node3 30575 1726867679.69141: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867679.69145: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867679.69170: getting variables 30575 1726867679.69172: in VariableManager get_vars() 30575 1726867679.69208: Calling all_inventory to load vars for managed_node3 30575 1726867679.69211: Calling groups_inventory to load vars for managed_node3 30575 1726867679.69212: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867679.69220: Calling all_plugins_play to load vars for managed_node3 30575 1726867679.69222: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867679.69225: Calling groups_plugins_play to load vars for managed_node3 30575 1726867679.70074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867679.70908: done with get_vars() 30575 1726867679.70923: done getting variables 30575 1726867679.70960: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:27:59 -0400 (0:00:00.030) 0:01:55.087 ****** 30575 1726867679.70985: entering _queue_task() for managed_node3/fail 30575 1726867679.71175: worker is 1 (out of 1 available) 30575 1726867679.71191: exiting _queue_task() for managed_node3/fail 30575 1726867679.71204: done queuing things up, now waiting for results queue to drain 30575 1726867679.71206: waiting for pending results... 30575 1726867679.71374: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867679.71470: in run() - task 0affcac9-a3a5-e081-a588-0000000024a7 30575 1726867679.71482: variable 'ansible_search_path' from source: unknown 30575 1726867679.71486: variable 'ansible_search_path' from source: unknown 30575 1726867679.71513: calling self._execute() 30575 1726867679.71587: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867679.71592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867679.71600: variable 'omit' from source: magic vars 30575 1726867679.71862: variable 'ansible_distribution_major_version' from source: facts 30575 1726867679.71872: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867679.71951: variable 'network_state' from source: role '' defaults 30575 1726867679.71960: Evaluated conditional (network_state != {}): False 30575 1726867679.71964: when evaluation is False, skipping this task 30575 1726867679.71966: _execute() done 30575 1726867679.71969: dumping result to json 30575 1726867679.71973: done dumping result, returning 30575 1726867679.71984: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-e081-a588-0000000024a7] 30575 1726867679.71987: sending task result for task 0affcac9-a3a5-e081-a588-0000000024a7 30575 1726867679.72062: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024a7 30575 1726867679.72065: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867679.72122: no more pending results, returning what we have 30575 1726867679.72126: results queue empty 30575 1726867679.72127: checking for any_errors_fatal 30575 1726867679.72132: done checking for any_errors_fatal 30575 1726867679.72133: checking for max_fail_percentage 30575 1726867679.72135: done checking for max_fail_percentage 30575 1726867679.72135: checking to see if all hosts have failed and the running result is not ok 30575 1726867679.72136: done checking to see if all hosts have failed 30575 1726867679.72137: getting the remaining hosts for this loop 30575 1726867679.72138: done getting the remaining hosts for this loop 30575 1726867679.72141: getting the next task for host managed_node3 30575 1726867679.72148: done getting next task for host managed_node3 30575 1726867679.72152: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867679.72156: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867679.72181: getting variables 30575 1726867679.72182: in VariableManager get_vars() 30575 1726867679.72216: Calling all_inventory to load vars for managed_node3 30575 1726867679.72218: Calling groups_inventory to load vars for managed_node3 30575 1726867679.72220: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867679.72228: Calling all_plugins_play to load vars for managed_node3 30575 1726867679.72231: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867679.72233: Calling groups_plugins_play to load vars for managed_node3 30575 1726867679.72947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867679.73794: done with get_vars() 30575 1726867679.73808: done getting variables 30575 1726867679.73845: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:27:59 -0400 (0:00:00.028) 0:01:55.116 ****** 30575 1726867679.73868: entering _queue_task() for managed_node3/fail 30575 1726867679.74054: worker is 1 (out of 1 available) 30575 1726867679.74069: exiting _queue_task() for managed_node3/fail 30575 1726867679.74083: done queuing things up, now waiting for results queue to drain 30575 1726867679.74085: waiting for pending results... 30575 1726867679.74250: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867679.74335: in run() - task 0affcac9-a3a5-e081-a588-0000000024a8 30575 1726867679.74345: variable 'ansible_search_path' from source: unknown 30575 1726867679.74349: variable 'ansible_search_path' from source: unknown 30575 1726867679.74374: calling self._execute() 30575 1726867679.74447: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867679.74451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867679.74461: variable 'omit' from source: magic vars 30575 1726867679.74712: variable 'ansible_distribution_major_version' from source: facts 30575 1726867679.74723: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867679.74838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867679.76335: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867679.76641: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867679.76668: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867679.76694: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867679.76716: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867679.76772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867679.76795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867679.76815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867679.76842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867679.76853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867679.76925: variable 'ansible_distribution_major_version' from source: facts 30575 1726867679.76933: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30575 1726867679.77005: variable 'ansible_distribution' from source: facts 30575 1726867679.77009: variable '__network_rh_distros' from source: role '' defaults 30575 1726867679.77019: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30575 1726867679.77170: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867679.77189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867679.77206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867679.77232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867679.77244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867679.77280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867679.77296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867679.77312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867679.77338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 
1726867679.77348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867679.77380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867679.77398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867679.77414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867679.77439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867679.77449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867679.77629: variable 'network_connections' from source: include params 30575 1726867679.77637: variable 'interface' from source: play vars 30575 1726867679.77687: variable 'interface' from source: play vars 30575 1726867679.77693: variable 'network_state' from source: role '' defaults 30575 1726867679.77739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867679.77847: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867679.77874: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867679.77899: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867679.77923: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867679.77951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867679.77966: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867679.77989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867679.78007: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867679.78028: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30575 1726867679.78031: when evaluation is False, skipping this task 30575 1726867679.78034: _execute() done 30575 1726867679.78036: dumping result to json 30575 1726867679.78038: done dumping result, returning 30575 1726867679.78046: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-e081-a588-0000000024a8] 30575 1726867679.78051: sending task result for task 
0affcac9-a3a5-e081-a588-0000000024a8 30575 1726867679.78133: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024a8 30575 1726867679.78135: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30575 1726867679.78180: no more pending results, returning what we have 30575 1726867679.78184: results queue empty 30575 1726867679.78185: checking for any_errors_fatal 30575 1726867679.78191: done checking for any_errors_fatal 30575 1726867679.78192: checking for max_fail_percentage 30575 1726867679.78194: done checking for max_fail_percentage 30575 1726867679.78195: checking to see if all hosts have failed and the running result is not ok 30575 1726867679.78196: done checking to see if all hosts have failed 30575 1726867679.78197: getting the remaining hosts for this loop 30575 1726867679.78199: done getting the remaining hosts for this loop 30575 1726867679.78202: getting the next task for host managed_node3 30575 1726867679.78211: done getting next task for host managed_node3 30575 1726867679.78215: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867679.78222: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867679.78251: getting variables 30575 1726867679.78253: in VariableManager get_vars() 30575 1726867679.78299: Calling all_inventory to load vars for managed_node3 30575 1726867679.78301: Calling groups_inventory to load vars for managed_node3 30575 1726867679.78303: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867679.78311: Calling all_plugins_play to load vars for managed_node3 30575 1726867679.78314: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867679.78319: Calling groups_plugins_play to load vars for managed_node3 30575 1726867679.79235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867679.80079: done with get_vars() 30575 1726867679.80094: done getting variables 30575 1726867679.80137: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:27:59 -0400 (0:00:00.062) 0:01:55.179 ****** 30575 1726867679.80160: entering _queue_task() for managed_node3/dnf 30575 1726867679.80382: worker is 1 (out of 1 available) 30575 1726867679.80395: exiting _queue_task() for managed_node3/dnf 30575 1726867679.80409: done queuing things up, now waiting for results queue to drain 30575 1726867679.80410: waiting for pending results... 30575 1726867679.80587: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867679.80694: in run() - task 0affcac9-a3a5-e081-a588-0000000024a9 30575 1726867679.80704: variable 'ansible_search_path' from source: unknown 30575 1726867679.80707: variable 'ansible_search_path' from source: unknown 30575 1726867679.80740: calling self._execute() 30575 1726867679.80815: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867679.80822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867679.80827: variable 'omit' from source: magic vars 30575 1726867679.81099: variable 'ansible_distribution_major_version' from source: facts 30575 1726867679.81108: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867679.81239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867679.82733: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867679.82785: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867679.82812: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867679.82840: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867679.82859: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867679.82921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867679.82940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867679.82958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867679.82985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867679.82996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867679.83074: variable 'ansible_distribution' from source: facts 30575 1726867679.83080: variable 'ansible_distribution_major_version' from source: facts 30575 1726867679.83091: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30575 1726867679.83163: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867679.83245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867679.83264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867679.83281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867679.83306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867679.83319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867679.83345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867679.83366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867679.83381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867679.83405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867679.83415: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867679.83442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867679.83458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867679.83476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867679.83502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867679.83512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867679.83612: variable 'network_connections' from source: include params 30575 1726867679.83684: variable 'interface' from source: play vars 30575 1726867679.83689: variable 'interface' from source: play vars 30575 1726867679.83715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867679.83824: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867679.83850: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867679.83872: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867679.83895: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867679.83936: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867679.83953: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867679.83974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867679.83994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867679.84029: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867679.84176: variable 'network_connections' from source: include params 30575 1726867679.84181: variable 'interface' from source: play vars 30575 1726867679.84227: variable 'interface' from source: play vars 30575 1726867679.84244: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867679.84248: when evaluation is False, skipping this task 30575 1726867679.84250: _execute() done 30575 1726867679.84253: dumping result to json 30575 1726867679.84257: done dumping result, returning 30575 1726867679.84263: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-0000000024a9] 30575 
1726867679.84268: sending task result for task 0affcac9-a3a5-e081-a588-0000000024a9 30575 1726867679.84351: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024a9 30575 1726867679.84354: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867679.84402: no more pending results, returning what we have 30575 1726867679.84406: results queue empty 30575 1726867679.84407: checking for any_errors_fatal 30575 1726867679.84414: done checking for any_errors_fatal 30575 1726867679.84414: checking for max_fail_percentage 30575 1726867679.84416: done checking for max_fail_percentage 30575 1726867679.84419: checking to see if all hosts have failed and the running result is not ok 30575 1726867679.84420: done checking to see if all hosts have failed 30575 1726867679.84421: getting the remaining hosts for this loop 30575 1726867679.84422: done getting the remaining hosts for this loop 30575 1726867679.84425: getting the next task for host managed_node3 30575 1726867679.84434: done getting next task for host managed_node3 30575 1726867679.84437: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867679.84442: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867679.84467: getting variables 30575 1726867679.84468: in VariableManager get_vars() 30575 1726867679.84510: Calling all_inventory to load vars for managed_node3 30575 1726867679.84512: Calling groups_inventory to load vars for managed_node3 30575 1726867679.84515: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867679.84525: Calling all_plugins_play to load vars for managed_node3 30575 1726867679.84528: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867679.84530: Calling groups_plugins_play to load vars for managed_node3 30575 1726867679.85293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867679.86156: done with get_vars() 30575 1726867679.86171: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867679.86226: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:27:59 -0400 (0:00:00.060) 0:01:55.240 ****** 30575 1726867679.86248: entering _queue_task() for managed_node3/yum 30575 1726867679.86460: worker is 1 (out of 1 available) 30575 1726867679.86474: exiting _queue_task() for managed_node3/yum 30575 1726867679.86489: done queuing things up, now waiting for results queue to drain 30575 1726867679.86491: waiting for pending results... 30575 1726867679.86662: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867679.86760: in run() - task 0affcac9-a3a5-e081-a588-0000000024aa 30575 1726867679.86771: variable 'ansible_search_path' from source: unknown 30575 1726867679.86775: variable 'ansible_search_path' from source: unknown 30575 1726867679.86807: calling self._execute() 30575 1726867679.86887: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867679.86891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867679.86900: variable 'omit' from source: magic vars 30575 1726867679.87180: variable 'ansible_distribution_major_version' from source: facts 30575 1726867679.87189: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867679.87312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867679.93840: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867679.93890: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867679.93918: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867679.93944: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867679.93963: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867679.94015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867679.94037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867679.94054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867679.94081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867679.94092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867679.94160: variable 'ansible_distribution_major_version' from source: facts 30575 1726867679.94171: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30575 1726867679.94174: when evaluation is False, skipping this task 30575 1726867679.94176: _execute() done 30575 1726867679.94180: dumping result to json 30575 1726867679.94182: done dumping result, returning 30575 1726867679.94190: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-0000000024aa] 30575 1726867679.94193: sending task result for task 0affcac9-a3a5-e081-a588-0000000024aa 30575 1726867679.94276: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024aa skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30575 1726867679.94326: no more pending results, returning what we have 30575 1726867679.94330: results queue empty 30575 1726867679.94331: checking for any_errors_fatal 30575 1726867679.94337: done checking for any_errors_fatal 30575 1726867679.94338: checking for max_fail_percentage 30575 1726867679.94340: done checking for max_fail_percentage 30575 1726867679.94341: checking to see if all hosts have failed and the running result is not ok 30575 1726867679.94341: done checking to see if all hosts have failed 30575 1726867679.94342: getting the remaining hosts for this loop 30575 1726867679.94343: done getting the remaining hosts for this loop 30575 1726867679.94347: getting the next task for host managed_node3 30575 1726867679.94355: done getting next task for host managed_node3 30575 1726867679.94359: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867679.94363: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867679.94391: getting variables 30575 1726867679.94393: in VariableManager get_vars() 30575 1726867679.94436: Calling all_inventory to load vars for managed_node3 30575 1726867679.94438: Calling groups_inventory to load vars for managed_node3 30575 1726867679.94440: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867679.94448: Calling all_plugins_play to load vars for managed_node3 30575 1726867679.94451: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867679.94453: Calling groups_plugins_play to load vars for managed_node3 30575 1726867679.94990: WORKER PROCESS EXITING 30575 1726867679.99460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867680.00311: done with get_vars() 30575 1726867680.00331: done getting variables 30575 1726867680.00367: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : 
Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:28:00 -0400 (0:00:00.141) 0:01:55.381 ****** 30575 1726867680.00390: entering _queue_task() for managed_node3/fail 30575 1726867680.00675: worker is 1 (out of 1 available) 30575 1726867680.00690: exiting _queue_task() for managed_node3/fail 30575 1726867680.00704: done queuing things up, now waiting for results queue to drain 30575 1726867680.00707: waiting for pending results... 30575 1726867680.00907: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867680.01027: in run() - task 0affcac9-a3a5-e081-a588-0000000024ab 30575 1726867680.01040: variable 'ansible_search_path' from source: unknown 30575 1726867680.01044: variable 'ansible_search_path' from source: unknown 30575 1726867680.01075: calling self._execute() 30575 1726867680.01156: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867680.01162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867680.01170: variable 'omit' from source: magic vars 30575 1726867680.01456: variable 'ansible_distribution_major_version' from source: facts 30575 1726867680.01465: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867680.01559: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867680.01694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867680.03212: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867680.03270: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867680.03298: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867680.03324: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867680.03346: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867680.03406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.03428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.03445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.03474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.03487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.03521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.03535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.03552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.03581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.03592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.03622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.03636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.03652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.03679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.03690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.03804: variable 'network_connections' from source: include params 
30575 1726867680.03814: variable 'interface' from source: play vars 30575 1726867680.03861: variable 'interface' from source: play vars 30575 1726867680.03915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867680.04033: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867680.04061: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867680.04086: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867680.04107: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867680.04142: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867680.04157: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867680.04175: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.04194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867680.04234: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867680.04382: variable 'network_connections' from source: include params 30575 1726867680.04386: variable 'interface' from source: play vars 30575 1726867680.04429: variable 'interface' from source: play vars 30575 1726867680.04451: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867680.04454: when evaluation is False, skipping this task 30575 1726867680.04457: _execute() done 30575 1726867680.04459: dumping result to json 30575 1726867680.04461: done dumping result, returning 30575 1726867680.04467: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-0000000024ab] 30575 1726867680.04473: sending task result for task 0affcac9-a3a5-e081-a588-0000000024ab 30575 1726867680.04563: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024ab 30575 1726867680.04566: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867680.04616: no more pending results, returning what we have 30575 1726867680.04622: results queue empty 30575 1726867680.04622: checking for any_errors_fatal 30575 1726867680.04632: done checking for any_errors_fatal 30575 1726867680.04632: checking for max_fail_percentage 30575 1726867680.04634: done checking for max_fail_percentage 30575 1726867680.04635: checking to see if all hosts have failed and the running result is not ok 30575 1726867680.04636: done checking to see if all hosts have failed 30575 1726867680.04637: getting the remaining hosts for this loop 30575 1726867680.04638: done getting the remaining hosts for this loop 30575 1726867680.04642: getting the next task for host managed_node3 30575 1726867680.04651: done getting next task for host managed_node3 30575 1726867680.04654: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30575 1726867680.04659: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867680.04692: getting variables 30575 1726867680.04693: in VariableManager get_vars() 30575 1726867680.04743: Calling all_inventory to load vars for managed_node3 30575 1726867680.04746: Calling groups_inventory to load vars for managed_node3 30575 1726867680.04748: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867680.04756: Calling all_plugins_play to load vars for managed_node3 30575 1726867680.04759: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867680.04761: Calling groups_plugins_play to load vars for managed_node3 30575 1726867680.05580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867680.06565: done with get_vars() 30575 1726867680.06583: done getting variables 30575 1726867680.06628: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:28:00 -0400 (0:00:00.062) 0:01:55.444 ****** 30575 1726867680.06654: entering _queue_task() for managed_node3/package 30575 1726867680.06892: worker is 1 (out of 1 available) 30575 1726867680.06907: exiting _queue_task() for managed_node3/package 30575 1726867680.06922: done queuing things up, now waiting for results queue to drain 30575 1726867680.06924: waiting for pending results... 
30575 1726867680.07112: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30575 1726867680.07219: in run() - task 0affcac9-a3a5-e081-a588-0000000024ac 30575 1726867680.07232: variable 'ansible_search_path' from source: unknown 30575 1726867680.07236: variable 'ansible_search_path' from source: unknown 30575 1726867680.07269: calling self._execute() 30575 1726867680.07349: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867680.07354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867680.07363: variable 'omit' from source: magic vars 30575 1726867680.07642: variable 'ansible_distribution_major_version' from source: facts 30575 1726867680.07651: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867680.07781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867680.07968: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867680.08001: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867680.08028: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867680.08082: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867680.08167: variable 'network_packages' from source: role '' defaults 30575 1726867680.08242: variable '__network_provider_setup' from source: role '' defaults 30575 1726867680.08248: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867680.08295: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867680.08302: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867680.08347: variable 
'__network_packages_default_nm' from source: role '' defaults 30575 1726867680.08455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867680.09767: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867680.09810: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867680.09836: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867680.09861: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867680.09883: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867680.09948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.09968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.09990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.10018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.10028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 
1726867680.10058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.10078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.10097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.10184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.10189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.10270: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867680.10340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.10357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.10373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.10413: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.10420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.10471: variable 'ansible_python' from source: facts 30575 1726867680.10485: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867680.10542: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867680.10597: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867680.10681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.10698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.10715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.10745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.10755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.10788: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.10808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.10826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.10853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.10864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.10960: variable 'network_connections' from source: include params 30575 1726867680.10966: variable 'interface' from source: play vars 30575 1726867680.11035: variable 'interface' from source: play vars 30575 1726867680.11086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867680.11105: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867680.11127: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.11147: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867680.11185: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867680.11358: variable 'network_connections' from source: include params 30575 1726867680.11361: variable 'interface' from source: play vars 30575 1726867680.11434: variable 'interface' from source: play vars 30575 1726867680.11456: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867680.11510: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867680.11700: variable 'network_connections' from source: include params 30575 1726867680.11704: variable 'interface' from source: play vars 30575 1726867680.11749: variable 'interface' from source: play vars 30575 1726867680.11765: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867680.11821: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867680.12009: variable 'network_connections' from source: include params 30575 1726867680.12012: variable 'interface' from source: play vars 30575 1726867680.12060: variable 'interface' from source: play vars 30575 1726867680.12097: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867680.12139: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867680.12145: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867680.12188: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867680.12321: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867680.12610: variable 'network_connections' from source: include params 30575 1726867680.12614: variable 'interface' from 
source: play vars 30575 1726867680.12655: variable 'interface' from source: play vars 30575 1726867680.12661: variable 'ansible_distribution' from source: facts 30575 1726867680.12664: variable '__network_rh_distros' from source: role '' defaults 30575 1726867680.12670: variable 'ansible_distribution_major_version' from source: facts 30575 1726867680.12684: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867680.12788: variable 'ansible_distribution' from source: facts 30575 1726867680.12791: variable '__network_rh_distros' from source: role '' defaults 30575 1726867680.12797: variable 'ansible_distribution_major_version' from source: facts 30575 1726867680.12809: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867680.12913: variable 'ansible_distribution' from source: facts 30575 1726867680.12919: variable '__network_rh_distros' from source: role '' defaults 30575 1726867680.12922: variable 'ansible_distribution_major_version' from source: facts 30575 1726867680.12945: variable 'network_provider' from source: set_fact 30575 1726867680.12957: variable 'ansible_facts' from source: unknown 30575 1726867680.13398: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30575 1726867680.13402: when evaluation is False, skipping this task 30575 1726867680.13404: _execute() done 30575 1726867680.13407: dumping result to json 30575 1726867680.13409: done dumping result, returning 30575 1726867680.13419: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-e081-a588-0000000024ac] 30575 1726867680.13422: sending task result for task 0affcac9-a3a5-e081-a588-0000000024ac 30575 1726867680.13512: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024ac 30575 1726867680.13515: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, 
"false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30575 1726867680.13584: no more pending results, returning what we have 30575 1726867680.13588: results queue empty 30575 1726867680.13588: checking for any_errors_fatal 30575 1726867680.13594: done checking for any_errors_fatal 30575 1726867680.13595: checking for max_fail_percentage 30575 1726867680.13596: done checking for max_fail_percentage 30575 1726867680.13597: checking to see if all hosts have failed and the running result is not ok 30575 1726867680.13598: done checking to see if all hosts have failed 30575 1726867680.13599: getting the remaining hosts for this loop 30575 1726867680.13600: done getting the remaining hosts for this loop 30575 1726867680.13604: getting the next task for host managed_node3 30575 1726867680.13613: done getting next task for host managed_node3 30575 1726867680.13618: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867680.13623: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867680.13651: getting variables 30575 1726867680.13652: in VariableManager get_vars() 30575 1726867680.13700: Calling all_inventory to load vars for managed_node3 30575 1726867680.13702: Calling groups_inventory to load vars for managed_node3 30575 1726867680.13704: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867680.13713: Calling all_plugins_play to load vars for managed_node3 30575 1726867680.13716: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867680.13721: Calling groups_plugins_play to load vars for managed_node3 30575 1726867680.14513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867680.15386: done with get_vars() 30575 1726867680.15404: done getting variables 30575 1726867680.15446: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:28:00 -0400 (0:00:00.088) 0:01:55.532 ****** 30575 1726867680.15472: entering _queue_task() for managed_node3/package 30575 1726867680.15711: worker is 1 (out of 1 available) 30575 1726867680.15727: exiting _queue_task() for managed_node3/package 30575 1726867680.15741: done queuing things up, now waiting for results queue to drain 30575 
1726867680.15743: waiting for pending results... 30575 1726867680.15930: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867680.16026: in run() - task 0affcac9-a3a5-e081-a588-0000000024ad 30575 1726867680.16038: variable 'ansible_search_path' from source: unknown 30575 1726867680.16041: variable 'ansible_search_path' from source: unknown 30575 1726867680.16068: calling self._execute() 30575 1726867680.16160: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867680.16164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867680.16173: variable 'omit' from source: magic vars 30575 1726867680.16460: variable 'ansible_distribution_major_version' from source: facts 30575 1726867680.16469: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867680.16557: variable 'network_state' from source: role '' defaults 30575 1726867680.16567: Evaluated conditional (network_state != {}): False 30575 1726867680.16570: when evaluation is False, skipping this task 30575 1726867680.16573: _execute() done 30575 1726867680.16576: dumping result to json 30575 1726867680.16580: done dumping result, returning 30575 1726867680.16589: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-e081-a588-0000000024ad] 30575 1726867680.16593: sending task result for task 0affcac9-a3a5-e081-a588-0000000024ad 30575 1726867680.16684: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024ad 30575 1726867680.16687: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867680.16763: no more pending results, returning what we have 30575 1726867680.16766: 
results queue empty 30575 1726867680.16766: checking for any_errors_fatal 30575 1726867680.16771: done checking for any_errors_fatal 30575 1726867680.16771: checking for max_fail_percentage 30575 1726867680.16773: done checking for max_fail_percentage 30575 1726867680.16774: checking to see if all hosts have failed and the running result is not ok 30575 1726867680.16775: done checking to see if all hosts have failed 30575 1726867680.16775: getting the remaining hosts for this loop 30575 1726867680.16778: done getting the remaining hosts for this loop 30575 1726867680.16781: getting the next task for host managed_node3 30575 1726867680.16789: done getting next task for host managed_node3 30575 1726867680.16792: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867680.16796: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867680.16819: getting variables 30575 1726867680.16821: in VariableManager get_vars() 30575 1726867680.16856: Calling all_inventory to load vars for managed_node3 30575 1726867680.16859: Calling groups_inventory to load vars for managed_node3 30575 1726867680.16861: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867680.16869: Calling all_plugins_play to load vars for managed_node3 30575 1726867680.16872: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867680.16874: Calling groups_plugins_play to load vars for managed_node3 30575 1726867680.17750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867680.18592: done with get_vars() 30575 1726867680.18607: done getting variables 30575 1726867680.18646: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:28:00 -0400 (0:00:00.031) 0:01:55.564 ****** 30575 1726867680.18670: entering _queue_task() for managed_node3/package 30575 1726867680.18879: worker is 1 (out of 1 available) 30575 1726867680.18893: exiting _queue_task() for managed_node3/package 30575 1726867680.18904: done queuing things up, now waiting for results queue to drain 30575 1726867680.18906: waiting for pending results... 
30575 1726867680.19089: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867680.19191: in run() - task 0affcac9-a3a5-e081-a588-0000000024ae 30575 1726867680.19202: variable 'ansible_search_path' from source: unknown 30575 1726867680.19205: variable 'ansible_search_path' from source: unknown 30575 1726867680.19237: calling self._execute() 30575 1726867680.19311: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867680.19315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867680.19326: variable 'omit' from source: magic vars 30575 1726867680.19596: variable 'ansible_distribution_major_version' from source: facts 30575 1726867680.19605: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867680.19694: variable 'network_state' from source: role '' defaults 30575 1726867680.19702: Evaluated conditional (network_state != {}): False 30575 1726867680.19705: when evaluation is False, skipping this task 30575 1726867680.19708: _execute() done 30575 1726867680.19710: dumping result to json 30575 1726867680.19715: done dumping result, returning 30575 1726867680.19725: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-e081-a588-0000000024ae] 30575 1726867680.19730: sending task result for task 0affcac9-a3a5-e081-a588-0000000024ae 30575 1726867680.19819: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024ae 30575 1726867680.19822: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867680.19865: no more pending results, returning what we have 30575 1726867680.19869: results queue empty 30575 1726867680.19869: checking for 
any_errors_fatal 30575 1726867680.19875: done checking for any_errors_fatal 30575 1726867680.19876: checking for max_fail_percentage 30575 1726867680.19880: done checking for max_fail_percentage 30575 1726867680.19881: checking to see if all hosts have failed and the running result is not ok 30575 1726867680.19882: done checking to see if all hosts have failed 30575 1726867680.19882: getting the remaining hosts for this loop 30575 1726867680.19884: done getting the remaining hosts for this loop 30575 1726867680.19887: getting the next task for host managed_node3 30575 1726867680.19894: done getting next task for host managed_node3 30575 1726867680.19898: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867680.19902: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867680.19924: getting variables 30575 1726867680.19926: in VariableManager get_vars() 30575 1726867680.19961: Calling all_inventory to load vars for managed_node3 30575 1726867680.19964: Calling groups_inventory to load vars for managed_node3 30575 1726867680.19965: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867680.19973: Calling all_plugins_play to load vars for managed_node3 30575 1726867680.19975: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867680.19983: Calling groups_plugins_play to load vars for managed_node3 30575 1726867680.20709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867680.21569: done with get_vars() 30575 1726867680.21584: done getting variables 30575 1726867680.21625: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:28:00 -0400 (0:00:00.029) 0:01:55.594 ****** 30575 1726867680.21649: entering _queue_task() for managed_node3/service 30575 1726867680.21840: worker is 1 (out of 1 available) 30575 1726867680.21854: exiting _queue_task() for managed_node3/service 30575 1726867680.21866: done queuing things up, now waiting for results queue to drain 30575 1726867680.21868: waiting for pending results... 
30575 1726867680.22042: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867680.22141: in run() - task 0affcac9-a3a5-e081-a588-0000000024af 30575 1726867680.22151: variable 'ansible_search_path' from source: unknown 30575 1726867680.22154: variable 'ansible_search_path' from source: unknown 30575 1726867680.22181: calling self._execute() 30575 1726867680.22258: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867680.22261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867680.22268: variable 'omit' from source: magic vars 30575 1726867680.22532: variable 'ansible_distribution_major_version' from source: facts 30575 1726867680.22540: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867680.22625: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867680.22752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867680.24506: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867680.24551: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867680.24588: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867680.24615: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867680.24638: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867680.24695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30575 1726867680.24718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.24738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.24763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.24774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.24807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.24829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.24845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.24870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.24880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.24908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.24927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.24944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.24968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.24980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.25092: variable 'network_connections' from source: include params 30575 1726867680.25102: variable 'interface' from source: play vars 30575 1726867680.25153: variable 'interface' from source: play vars 30575 1726867680.25204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867680.25315: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867680.25346: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867680.25369: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867680.25402: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867680.25436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867680.25451: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867680.25472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.25490: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867680.25531: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867680.25680: variable 'network_connections' from source: include params 30575 1726867680.25683: variable 'interface' from source: play vars 30575 1726867680.25730: variable 'interface' from source: play vars 30575 1726867680.25747: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867680.25751: when evaluation is False, skipping this task 30575 1726867680.25753: _execute() done 30575 1726867680.25756: dumping result to json 30575 1726867680.25760: done dumping result, returning 30575 1726867680.25767: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-0000000024af] 30575 1726867680.25772: sending task result for task 0affcac9-a3a5-e081-a588-0000000024af 30575 1726867680.25858: done sending task result for task 
0affcac9-a3a5-e081-a588-0000000024af 30575 1726867680.25868: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867680.25916: no more pending results, returning what we have 30575 1726867680.25919: results queue empty 30575 1726867680.25920: checking for any_errors_fatal 30575 1726867680.25927: done checking for any_errors_fatal 30575 1726867680.25927: checking for max_fail_percentage 30575 1726867680.25929: done checking for max_fail_percentage 30575 1726867680.25930: checking to see if all hosts have failed and the running result is not ok 30575 1726867680.25931: done checking to see if all hosts have failed 30575 1726867680.25932: getting the remaining hosts for this loop 30575 1726867680.25933: done getting the remaining hosts for this loop 30575 1726867680.25937: getting the next task for host managed_node3 30575 1726867680.25946: done getting next task for host managed_node3 30575 1726867680.25950: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867680.25954: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867680.25986: getting variables 30575 1726867680.25987: in VariableManager get_vars() 30575 1726867680.26034: Calling all_inventory to load vars for managed_node3 30575 1726867680.26036: Calling groups_inventory to load vars for managed_node3 30575 1726867680.26038: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867680.26047: Calling all_plugins_play to load vars for managed_node3 30575 1726867680.26049: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867680.26052: Calling groups_plugins_play to load vars for managed_node3 30575 1726867680.27009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867680.27866: done with get_vars() 30575 1726867680.27884: done getting variables 30575 1726867680.27926: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:28:00 -0400 (0:00:00.062) 0:01:55.657 ****** 30575 1726867680.27950: entering _queue_task() for managed_node3/service 30575 1726867680.28190: worker is 1 (out of 1 available) 30575 1726867680.28204: exiting _queue_task() for managed_node3/service 30575 1726867680.28217: done 
queuing things up, now waiting for results queue to drain 30575 1726867680.28218: waiting for pending results... 30575 1726867680.28410: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867680.28510: in run() - task 0affcac9-a3a5-e081-a588-0000000024b0 30575 1726867680.28524: variable 'ansible_search_path' from source: unknown 30575 1726867680.28528: variable 'ansible_search_path' from source: unknown 30575 1726867680.28557: calling self._execute() 30575 1726867680.28645: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867680.28649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867680.28659: variable 'omit' from source: magic vars 30575 1726867680.28944: variable 'ansible_distribution_major_version' from source: facts 30575 1726867680.28953: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867680.29073: variable 'network_provider' from source: set_fact 30575 1726867680.29076: variable 'network_state' from source: role '' defaults 30575 1726867680.29089: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30575 1726867680.29095: variable 'omit' from source: magic vars 30575 1726867680.29139: variable 'omit' from source: magic vars 30575 1726867680.29158: variable 'network_service_name' from source: role '' defaults 30575 1726867680.29208: variable 'network_service_name' from source: role '' defaults 30575 1726867680.29284: variable '__network_provider_setup' from source: role '' defaults 30575 1726867680.29288: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867680.29337: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867680.29345: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867680.29389: variable '__network_packages_default_nm' from source: role '' 
defaults 30575 1726867680.29541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867680.31026: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867680.31082: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867680.31109: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867680.31137: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867680.31156: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867680.31216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.31239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.31256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.31288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.31299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.31333: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.31349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.31365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.31395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.31405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.31557: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867680.31635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.31651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.31668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.31694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.31710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.31766: variable 'ansible_python' from source: facts 30575 1726867680.31780: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867680.31838: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867680.31891: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867680.31975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.31994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.32010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.32040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.32050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.32084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.32104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.32121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.32148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.32159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.32254: variable 'network_connections' from source: include params 30575 1726867680.32258: variable 'interface' from source: play vars 30575 1726867680.32310: variable 'interface' from source: play vars 30575 1726867680.32386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867680.32514: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867680.32550: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867680.32585: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867680.32616: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867680.32659: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867680.32682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867680.32707: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.32733: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867680.32769: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867680.32953: variable 'network_connections' from source: include params 30575 1726867680.32957: variable 'interface' from source: play vars 30575 1726867680.33011: variable 'interface' from source: play vars 30575 1726867680.33036: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867680.33089: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867680.33272: variable 'network_connections' from source: include params 30575 1726867680.33276: variable 'interface' from source: play vars 30575 1726867680.33327: variable 'interface' from source: play vars 30575 1726867680.33343: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867680.33400: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867680.33584: variable 'network_connections' from source: include params 30575 1726867680.33587: variable 'interface' from source: play vars 30575 1726867680.33635: variable 'interface' from source: play vars 30575 1726867680.33674: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30575 1726867680.33716: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867680.33722: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867680.33763: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867680.33895: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867680.34203: variable 'network_connections' from source: include params 30575 1726867680.34206: variable 'interface' from source: play vars 30575 1726867680.34251: variable 'interface' from source: play vars 30575 1726867680.34257: variable 'ansible_distribution' from source: facts 30575 1726867680.34259: variable '__network_rh_distros' from source: role '' defaults 30575 1726867680.34266: variable 'ansible_distribution_major_version' from source: facts 30575 1726867680.34276: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867680.34389: variable 'ansible_distribution' from source: facts 30575 1726867680.34392: variable '__network_rh_distros' from source: role '' defaults 30575 1726867680.34396: variable 'ansible_distribution_major_version' from source: facts 30575 1726867680.34407: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867680.34516: variable 'ansible_distribution' from source: facts 30575 1726867680.34522: variable '__network_rh_distros' from source: role '' defaults 30575 1726867680.34525: variable 'ansible_distribution_major_version' from source: facts 30575 1726867680.34550: variable 'network_provider' from source: set_fact 30575 1726867680.34567: variable 'omit' from source: magic vars 30575 1726867680.34588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867680.34607: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867680.34655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867680.34659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867680.34661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867680.34663: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867680.34666: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867680.34668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867680.34736: Set connection var ansible_pipelining to False 30575 1726867680.34739: Set connection var ansible_shell_type to sh 30575 1726867680.34744: Set connection var ansible_shell_executable to /bin/sh 30575 1726867680.34749: Set connection var ansible_timeout to 10 30575 1726867680.34754: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867680.34761: Set connection var ansible_connection to ssh 30575 1726867680.34785: variable 'ansible_shell_executable' from source: unknown 30575 1726867680.34789: variable 'ansible_connection' from source: unknown 30575 1726867680.34791: variable 'ansible_module_compression' from source: unknown 30575 1726867680.34793: variable 'ansible_shell_type' from source: unknown 30575 1726867680.34796: variable 'ansible_shell_executable' from source: unknown 30575 1726867680.34798: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867680.34802: variable 'ansible_pipelining' from source: unknown 30575 1726867680.34804: variable 'ansible_timeout' from source: unknown 30575 1726867680.34808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867680.34879: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867680.34895: variable 'omit' from source: magic vars 30575 1726867680.34900: starting attempt loop 30575 1726867680.34903: running the handler 30575 1726867680.34956: variable 'ansible_facts' from source: unknown 30575 1726867680.35436: _low_level_execute_command(): starting 30575 1726867680.35442: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867680.35931: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867680.35935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867680.35939: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867680.35942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867680.35996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867680.35999: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867680.36001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867680.36061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867680.37743: stdout chunk (state=3): >>>/root <<< 30575 1726867680.37842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867680.37867: stderr chunk (state=3): >>><<< 30575 1726867680.37870: stdout chunk (state=3): >>><<< 30575 1726867680.37891: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867680.37900: _low_level_execute_command(): starting 30575 1726867680.37905: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir 
"` echo /root/.ansible/tmp/ansible-tmp-1726867680.3789-35660-137556255184023 `" && echo ansible-tmp-1726867680.3789-35660-137556255184023="` echo /root/.ansible/tmp/ansible-tmp-1726867680.3789-35660-137556255184023 `" ) && sleep 0' 30575 1726867680.38304: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867680.38308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867680.38327: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867680.38372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867680.38375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867680.38428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867680.40315: stdout chunk (state=3): >>>ansible-tmp-1726867680.3789-35660-137556255184023=/root/.ansible/tmp/ansible-tmp-1726867680.3789-35660-137556255184023 <<< 30575 1726867680.40423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867680.40444: stderr chunk 
(state=3): >>><<< 30575 1726867680.40447: stdout chunk (state=3): >>><<< 30575 1726867680.40462: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867680.3789-35660-137556255184023=/root/.ansible/tmp/ansible-tmp-1726867680.3789-35660-137556255184023 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867680.40486: variable 'ansible_module_compression' from source: unknown 30575 1726867680.40525: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30575 1726867680.40572: variable 'ansible_facts' from source: unknown 30575 1726867680.40706: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867680.3789-35660-137556255184023/AnsiballZ_systemd.py 30575 1726867680.40805: Sending initial data 30575 1726867680.40809: Sent initial data (153 bytes) 30575 
1726867680.41222: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867680.41225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867680.41231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867680.41233: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867680.41235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867680.41282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867680.41285: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867680.41334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867680.42866: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports 
extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867680.42872: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867680.42910: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867680.42958: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpopr_y11r /root/.ansible/tmp/ansible-tmp-1726867680.3789-35660-137556255184023/AnsiballZ_systemd.py <<< 30575 1726867680.42962: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867680.3789-35660-137556255184023/AnsiballZ_systemd.py" <<< 30575 1726867680.42998: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpopr_y11r" to remote "/root/.ansible/tmp/ansible-tmp-1726867680.3789-35660-137556255184023/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867680.3789-35660-137556255184023/AnsiballZ_systemd.py" <<< 30575 1726867680.44051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867680.44086: stderr chunk (state=3): >>><<< 30575 1726867680.44089: stdout chunk (state=3): >>><<< 30575 1726867680.44123: done transferring module to remote 30575 1726867680.44131: _low_level_execute_command(): starting 30575 1726867680.44134: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867680.3789-35660-137556255184023/ /root/.ansible/tmp/ansible-tmp-1726867680.3789-35660-137556255184023/AnsiballZ_systemd.py && sleep 0' 30575 1726867680.44545: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867680.44548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867680.44551: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867680.44553: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867680.44555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867680.44556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867680.44611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867680.44614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867680.44655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867680.46396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867680.46416: stderr chunk (state=3): >>><<< 30575 1726867680.46421: stdout chunk (state=3): >>><<< 30575 1726867680.46434: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867680.46437: _low_level_execute_command(): starting 30575 1726867680.46440: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867680.3789-35660-137556255184023/AnsiballZ_systemd.py && sleep 0' 30575 1726867680.46814: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867680.46821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867680.46833: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867680.46886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867680.46893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867680.46939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867680.76134: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 
17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10575872", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3326451712", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "2019750000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not 
set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30575 1726867680.76157: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system<<< 30575 1726867680.76167: stdout chunk (state=3): >>>.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", 
"InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30575 1726867680.78000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867680.78031: stderr chunk (state=3): >>><<< 30575 1726867680.78034: stdout chunk (state=3): >>><<< 30575 1726867680.78054: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10575872", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3326451712", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "2019750000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867680.78181: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867680.3789-35660-137556255184023/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867680.78196: _low_level_execute_command(): starting 30575 1726867680.78201: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867680.3789-35660-137556255184023/ > /dev/null 2>&1 && sleep 0' 30575 1726867680.78641: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867680.78645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867680.78656: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867680.78718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867680.78725: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867680.78727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867680.78768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867680.80595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867680.80625: stderr chunk (state=3): >>><<< 30575 1726867680.80628: stdout chunk (state=3): >>><<< 30575 1726867680.80642: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 30575 1726867680.80652: handler run complete 30575 1726867680.80693: attempt loop complete, returning result 30575 1726867680.80696: _execute() done 30575 1726867680.80698: dumping result to json 30575 1726867680.80710: done dumping result, returning 30575 1726867680.80762: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-e081-a588-0000000024b0] 30575 1726867680.80765: sending task result for task 0affcac9-a3a5-e081-a588-0000000024b0 30575 1726867680.81099: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024b0 30575 1726867680.81102: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867680.81169: no more pending results, returning what we have 30575 1726867680.81172: results queue empty 30575 1726867680.81173: checking for any_errors_fatal 30575 1726867680.81181: done checking for any_errors_fatal 30575 1726867680.81182: checking for max_fail_percentage 30575 1726867680.81183: done checking for max_fail_percentage 30575 1726867680.81184: checking to see if all hosts have failed and the running result is not ok 30575 1726867680.81185: done checking to see if all hosts have failed 30575 1726867680.81185: getting the remaining hosts for this loop 30575 1726867680.81187: done getting the remaining hosts for this loop 30575 1726867680.81190: getting the next task for host managed_node3 30575 1726867680.81198: done getting next task for host managed_node3 30575 1726867680.81201: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867680.81206: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867680.81226: getting variables 30575 1726867680.81228: in VariableManager get_vars() 30575 1726867680.81267: Calling all_inventory to load vars for managed_node3 30575 1726867680.81270: Calling groups_inventory to load vars for managed_node3 30575 1726867680.81272: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867680.81283: Calling all_plugins_play to load vars for managed_node3 30575 1726867680.81286: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867680.81288: Calling groups_plugins_play to load vars for managed_node3 30575 1726867680.82113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867680.83142: done with get_vars() 30575 1726867680.83161: done getting variables 30575 1726867680.83207: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:28:00 -0400 (0:00:00.552) 0:01:56.209 ****** 30575 1726867680.83241: entering _queue_task() for managed_node3/service 30575 1726867680.83522: worker is 1 (out of 1 available) 30575 1726867680.83539: exiting _queue_task() for managed_node3/service 30575 1726867680.83552: done queuing things up, now waiting for results queue to drain 30575 1726867680.83553: waiting for pending results... 30575 1726867680.83750: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867680.83857: in run() - task 0affcac9-a3a5-e081-a588-0000000024b1 30575 1726867680.83869: variable 'ansible_search_path' from source: unknown 30575 1726867680.83873: variable 'ansible_search_path' from source: unknown 30575 1726867680.83906: calling self._execute() 30575 1726867680.83987: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867680.83991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867680.84002: variable 'omit' from source: magic vars 30575 1726867680.84286: variable 'ansible_distribution_major_version' from source: facts 30575 1726867680.84295: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867680.84380: variable 'network_provider' from source: set_fact 30575 1726867680.84383: Evaluated conditional (network_provider == "nm"): True 30575 1726867680.84452: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867680.84513: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 30575 1726867680.84634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867680.86126: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867680.86180: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867680.86208: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867680.86235: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867680.86256: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867680.86331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.86352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.86370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.86401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.86412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.86446: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.86464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.86482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.86511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.86612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.86615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.86620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.86623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.86625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 
1726867680.86627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.86709: variable 'network_connections' from source: include params 30575 1726867680.86722: variable 'interface' from source: play vars 30575 1726867680.86773: variable 'interface' from source: play vars 30575 1726867680.86829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867680.86939: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867680.86966: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867680.86991: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867680.87013: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867680.87046: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867680.87065: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867680.87083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.87101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867680.87142: variable 
'__network_wireless_connections_defined' from source: role '' defaults 30575 1726867680.87304: variable 'network_connections' from source: include params 30575 1726867680.87308: variable 'interface' from source: play vars 30575 1726867680.87353: variable 'interface' from source: play vars 30575 1726867680.87383: Evaluated conditional (__network_wpa_supplicant_required): False 30575 1726867680.87386: when evaluation is False, skipping this task 30575 1726867680.87389: _execute() done 30575 1726867680.87391: dumping result to json 30575 1726867680.87393: done dumping result, returning 30575 1726867680.87401: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-e081-a588-0000000024b1] 30575 1726867680.87412: sending task result for task 0affcac9-a3a5-e081-a588-0000000024b1 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30575 1726867680.87542: no more pending results, returning what we have 30575 1726867680.87545: results queue empty 30575 1726867680.87546: checking for any_errors_fatal 30575 1726867680.87574: done checking for any_errors_fatal 30575 1726867680.87575: checking for max_fail_percentage 30575 1726867680.87576: done checking for max_fail_percentage 30575 1726867680.87579: checking to see if all hosts have failed and the running result is not ok 30575 1726867680.87580: done checking to see if all hosts have failed 30575 1726867680.87580: getting the remaining hosts for this loop 30575 1726867680.87583: done getting the remaining hosts for this loop 30575 1726867680.87588: getting the next task for host managed_node3 30575 1726867680.87598: done getting next task for host managed_node3 30575 1726867680.87602: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867680.87606: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867680.87636: getting variables 30575 1726867680.87638: in VariableManager get_vars() 30575 1726867680.87691: Calling all_inventory to load vars for managed_node3 30575 1726867680.87693: Calling groups_inventory to load vars for managed_node3 30575 1726867680.87696: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867680.87701: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024b1 30575 1726867680.87703: WORKER PROCESS EXITING 30575 1726867680.87712: Calling all_plugins_play to load vars for managed_node3 30575 1726867680.87715: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867680.87717: Calling groups_plugins_play to load vars for managed_node3 30575 1726867680.88543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867680.89419: done with get_vars() 30575 1726867680.89438: done getting variables 30575 1726867680.89485: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:28:00 -0400 (0:00:00.062) 0:01:56.272 ****** 30575 1726867680.89511: entering _queue_task() for managed_node3/service 30575 1726867680.89784: worker is 1 (out of 1 available) 30575 1726867680.89799: exiting _queue_task() for managed_node3/service 30575 1726867680.89814: done queuing things up, now waiting for results queue to drain 30575 1726867680.89815: waiting for pending results... 
30575 1726867680.90018: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867680.90115: in run() - task 0affcac9-a3a5-e081-a588-0000000024b2 30575 1726867680.90130: variable 'ansible_search_path' from source: unknown 30575 1726867680.90135: variable 'ansible_search_path' from source: unknown 30575 1726867680.90166: calling self._execute() 30575 1726867680.90249: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867680.90254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867680.90264: variable 'omit' from source: magic vars 30575 1726867680.90552: variable 'ansible_distribution_major_version' from source: facts 30575 1726867680.90560: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867680.90647: variable 'network_provider' from source: set_fact 30575 1726867680.90650: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867680.90654: when evaluation is False, skipping this task 30575 1726867680.90656: _execute() done 30575 1726867680.90662: dumping result to json 30575 1726867680.90665: done dumping result, returning 30575 1726867680.90672: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-e081-a588-0000000024b2] 30575 1726867680.90679: sending task result for task 0affcac9-a3a5-e081-a588-0000000024b2 30575 1726867680.90764: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024b2 30575 1726867680.90767: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867680.90849: no more pending results, returning what we have 30575 1726867680.90852: results queue empty 30575 1726867680.90853: checking for any_errors_fatal 30575 1726867680.90862: done checking for 
any_errors_fatal 30575 1726867680.90863: checking for max_fail_percentage 30575 1726867680.90864: done checking for max_fail_percentage 30575 1726867680.90865: checking to see if all hosts have failed and the running result is not ok 30575 1726867680.90866: done checking to see if all hosts have failed 30575 1726867680.90867: getting the remaining hosts for this loop 30575 1726867680.90868: done getting the remaining hosts for this loop 30575 1726867680.90872: getting the next task for host managed_node3 30575 1726867680.90882: done getting next task for host managed_node3 30575 1726867680.90885: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867680.90889: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867680.90913: getting variables 30575 1726867680.90914: in VariableManager get_vars() 30575 1726867680.90955: Calling all_inventory to load vars for managed_node3 30575 1726867680.90957: Calling groups_inventory to load vars for managed_node3 30575 1726867680.90959: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867680.90968: Calling all_plugins_play to load vars for managed_node3 30575 1726867680.90970: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867680.90973: Calling groups_plugins_play to load vars for managed_node3 30575 1726867680.91901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867680.92756: done with get_vars() 30575 1726867680.92774: done getting variables 30575 1726867680.92821: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:28:00 -0400 (0:00:00.033) 0:01:56.306 ****** 30575 1726867680.92848: entering _queue_task() for managed_node3/copy 30575 1726867680.93123: worker is 1 (out of 1 available) 30575 1726867680.93137: exiting _queue_task() for managed_node3/copy 30575 1726867680.93151: done queuing things up, now waiting for results queue to drain 30575 1726867680.93152: waiting for pending results... 
30575 1726867680.93359: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867680.93472: in run() - task 0affcac9-a3a5-e081-a588-0000000024b3 30575 1726867680.93488: variable 'ansible_search_path' from source: unknown 30575 1726867680.93492: variable 'ansible_search_path' from source: unknown 30575 1726867680.93522: calling self._execute() 30575 1726867680.93600: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867680.93605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867680.93616: variable 'omit' from source: magic vars 30575 1726867680.93901: variable 'ansible_distribution_major_version' from source: facts 30575 1726867680.93909: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867680.93997: variable 'network_provider' from source: set_fact 30575 1726867680.94000: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867680.94003: when evaluation is False, skipping this task 30575 1726867680.94006: _execute() done 30575 1726867680.94011: dumping result to json 30575 1726867680.94014: done dumping result, returning 30575 1726867680.94027: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-e081-a588-0000000024b3] 30575 1726867680.94030: sending task result for task 0affcac9-a3a5-e081-a588-0000000024b3 30575 1726867680.94124: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024b3 30575 1726867680.94126: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30575 1726867680.94186: no more pending results, returning what we have 30575 1726867680.94190: results queue empty 30575 1726867680.94191: checking for 
any_errors_fatal 30575 1726867680.94200: done checking for any_errors_fatal 30575 1726867680.94200: checking for max_fail_percentage 30575 1726867680.94202: done checking for max_fail_percentage 30575 1726867680.94203: checking to see if all hosts have failed and the running result is not ok 30575 1726867680.94204: done checking to see if all hosts have failed 30575 1726867680.94205: getting the remaining hosts for this loop 30575 1726867680.94206: done getting the remaining hosts for this loop 30575 1726867680.94210: getting the next task for host managed_node3 30575 1726867680.94219: done getting next task for host managed_node3 30575 1726867680.94223: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867680.94228: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867680.94256: getting variables 30575 1726867680.94257: in VariableManager get_vars() 30575 1726867680.94299: Calling all_inventory to load vars for managed_node3 30575 1726867680.94302: Calling groups_inventory to load vars for managed_node3 30575 1726867680.94304: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867680.94314: Calling all_plugins_play to load vars for managed_node3 30575 1726867680.94316: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867680.94319: Calling groups_plugins_play to load vars for managed_node3 30575 1726867680.95103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867680.95973: done with get_vars() 30575 1726867680.95993: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:28:00 -0400 (0:00:00.032) 0:01:56.338 ****** 30575 1726867680.96057: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867680.96328: worker is 1 (out of 1 available) 30575 1726867680.96342: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867680.96356: done queuing things up, now waiting for results queue to drain 30575 1726867680.96358: waiting for pending results... 
30575 1726867680.96565: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867680.96684: in run() - task 0affcac9-a3a5-e081-a588-0000000024b4 30575 1726867680.96694: variable 'ansible_search_path' from source: unknown 30575 1726867680.96697: variable 'ansible_search_path' from source: unknown 30575 1726867680.96731: calling self._execute() 30575 1726867680.96811: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867680.96816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867680.96827: variable 'omit' from source: magic vars 30575 1726867680.97109: variable 'ansible_distribution_major_version' from source: facts 30575 1726867680.97118: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867680.97127: variable 'omit' from source: magic vars 30575 1726867680.97170: variable 'omit' from source: magic vars 30575 1726867680.97285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867680.99091: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867680.99138: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867680.99166: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867680.99195: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867680.99216: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867680.99278: variable 'network_provider' from source: set_fact 30575 1726867680.99379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867680.99399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867680.99420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867680.99448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867680.99459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867680.99517: variable 'omit' from source: magic vars 30575 1726867680.99590: variable 'omit' from source: magic vars 30575 1726867680.99663: variable 'network_connections' from source: include params 30575 1726867680.99672: variable 'interface' from source: play vars 30575 1726867680.99718: variable 'interface' from source: play vars 30575 1726867680.99826: variable 'omit' from source: magic vars 30575 1726867680.99833: variable '__lsr_ansible_managed' from source: task vars 30575 1726867680.99879: variable '__lsr_ansible_managed' from source: task vars 30575 1726867681.00004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30575 1726867681.00150: Loaded config def from plugin (lookup/template) 30575 1726867681.00153: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30575 1726867681.00174: File lookup term: get_ansible_managed.j2 30575 1726867681.00180: variable 
'ansible_search_path' from source: unknown 30575 1726867681.00183: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30575 1726867681.00196: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30575 1726867681.00209: variable 'ansible_search_path' from source: unknown 30575 1726867681.03516: variable 'ansible_managed' from source: unknown 30575 1726867681.03598: variable 'omit' from source: magic vars 30575 1726867681.03619: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867681.03643: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867681.03658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867681.03672: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30575 1726867681.03682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867681.03705: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867681.03708: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867681.03710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867681.03779: Set connection var ansible_pipelining to False 30575 1726867681.03783: Set connection var ansible_shell_type to sh 30575 1726867681.03788: Set connection var ansible_shell_executable to /bin/sh 30575 1726867681.03794: Set connection var ansible_timeout to 10 30575 1726867681.03798: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867681.03805: Set connection var ansible_connection to ssh 30575 1726867681.03826: variable 'ansible_shell_executable' from source: unknown 30575 1726867681.03831: variable 'ansible_connection' from source: unknown 30575 1726867681.03834: variable 'ansible_module_compression' from source: unknown 30575 1726867681.03836: variable 'ansible_shell_type' from source: unknown 30575 1726867681.03838: variable 'ansible_shell_executable' from source: unknown 30575 1726867681.03840: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867681.03842: variable 'ansible_pipelining' from source: unknown 30575 1726867681.03844: variable 'ansible_timeout' from source: unknown 30575 1726867681.03846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867681.03941: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867681.03953: variable 'omit' from 
source: magic vars 30575 1726867681.03958: starting attempt loop 30575 1726867681.03960: running the handler 30575 1726867681.03971: _low_level_execute_command(): starting 30575 1726867681.03978: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867681.04483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867681.04487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.04494: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867681.04503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.04556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867681.04559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867681.04561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867681.04626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867681.06310: stdout chunk (state=3): >>>/root <<< 30575 1726867681.06406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 
1726867681.06439: stderr chunk (state=3): >>><<< 30575 1726867681.06442: stdout chunk (state=3): >>><<< 30575 1726867681.06463: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867681.06474: _low_level_execute_command(): starting 30575 1726867681.06481: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867681.064641-35674-71301667584819 `" && echo ansible-tmp-1726867681.064641-35674-71301667584819="` echo /root/.ansible/tmp/ansible-tmp-1726867681.064641-35674-71301667584819 `" ) && sleep 0' 30575 1726867681.06926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
30575 1726867681.06929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.06932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867681.06934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867681.06936: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.06980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867681.06984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867681.07036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867681.08931: stdout chunk (state=3): >>>ansible-tmp-1726867681.064641-35674-71301667584819=/root/.ansible/tmp/ansible-tmp-1726867681.064641-35674-71301667584819 <<< 30575 1726867681.09034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867681.09068: stderr chunk (state=3): >>><<< 30575 1726867681.09071: stdout chunk (state=3): >>><<< 30575 1726867681.09090: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867681.064641-35674-71301667584819=/root/.ansible/tmp/ansible-tmp-1726867681.064641-35674-71301667584819 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867681.09131: variable 'ansible_module_compression' from source: unknown 30575 1726867681.09173: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30575 1726867681.09202: variable 'ansible_facts' from source: unknown 30575 1726867681.09270: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867681.064641-35674-71301667584819/AnsiballZ_network_connections.py 30575 1726867681.09376: Sending initial data 30575 1726867681.09381: Sent initial data (166 bytes) 30575 1726867681.09848: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867681.09852: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867681.09858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.09860: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867681.09862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867681.09864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.09911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867681.09914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867681.09918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867681.09962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867681.11487: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 30575 
1726867681.11493: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867681.11530: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867681.11576: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmppote3p_q /root/.ansible/tmp/ansible-tmp-1726867681.064641-35674-71301667584819/AnsiballZ_network_connections.py <<< 30575 1726867681.11586: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867681.064641-35674-71301667584819/AnsiballZ_network_connections.py" <<< 30575 1726867681.11619: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmppote3p_q" to remote "/root/.ansible/tmp/ansible-tmp-1726867681.064641-35674-71301667584819/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867681.064641-35674-71301667584819/AnsiballZ_network_connections.py" <<< 30575 1726867681.12339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867681.12379: stderr chunk (state=3): >>><<< 30575 1726867681.12382: stdout chunk (state=3): >>><<< 30575 1726867681.12422: done transferring module to remote 30575 1726867681.12430: _low_level_execute_command(): starting 30575 1726867681.12436: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867681.064641-35674-71301667584819/ /root/.ansible/tmp/ansible-tmp-1726867681.064641-35674-71301667584819/AnsiballZ_network_connections.py && sleep 0' 30575 1726867681.12864: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867681.12869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867681.12871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.12873: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867681.12875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.12924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867681.12927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867681.12979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867681.14692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867681.14714: stderr chunk (state=3): >>><<< 30575 1726867681.14717: stdout chunk (state=3): >>><<< 30575 1726867681.14732: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867681.14735: _low_level_execute_command(): starting 30575 1726867681.14739: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867681.064641-35674-71301667584819/AnsiballZ_network_connections.py && sleep 0' 30575 1726867681.15138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867681.15142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.15144: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
30575 1726867681.15146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.15196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867681.15199: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867681.15251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867681.48355: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h6839e90/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h6839e90/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/0739a9ca-1102-4bed-b35d-0eb6b0f005e6: error=unknown <<< 30575 1726867681.48523: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", 
"persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30575 1726867681.50368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867681.50406: stderr chunk (state=3): >>><<< 30575 1726867681.50409: stdout chunk (state=3): >>><<< 30575 1726867681.50425: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h6839e90/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h6839e90/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/0739a9ca-1102-4bed-b35d-0eb6b0f005e6: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867681.50454: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867681.064641-35674-71301667584819/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867681.50462: _low_level_execute_command(): starting 30575 1726867681.50467: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867681.064641-35674-71301667584819/ > /dev/null 2>&1 && sleep 0' 30575 1726867681.50922: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867681.50926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867681.50930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.50932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867681.50934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.50982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867681.50998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867681.51001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867681.51041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867681.52908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867681.52935: stderr chunk (state=3): >>><<< 30575 1726867681.52940: stdout chunk (state=3): >>><<< 30575 1726867681.52953: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867681.52960: handler run complete 30575 1726867681.52985: attempt loop complete, returning result 30575 1726867681.52988: _execute() done 30575 1726867681.52990: dumping result to json 30575 1726867681.52995: done dumping result, returning 30575 1726867681.53004: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-e081-a588-0000000024b4] 30575 1726867681.53008: sending task result for task 0affcac9-a3a5-e081-a588-0000000024b4 30575 1726867681.53105: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024b4 30575 1726867681.53108: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 30575 1726867681.53209: no more pending results, returning what we have 30575 1726867681.53212: results queue empty 30575 1726867681.53213: checking for any_errors_fatal 30575 1726867681.53221: done checking for any_errors_fatal 30575 1726867681.53222: checking for max_fail_percentage 30575 1726867681.53224: done checking for max_fail_percentage 30575 1726867681.53224: checking to see if all hosts have failed and the running result is not ok 30575 1726867681.53225: done checking to see if all hosts have failed 30575 1726867681.53226: getting the remaining hosts for this loop 30575 1726867681.53227: done getting the remaining hosts for this loop 30575 1726867681.53230: 
getting the next task for host managed_node3 30575 1726867681.53238: done getting next task for host managed_node3 30575 1726867681.53241: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867681.53245: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867681.53258: getting variables 30575 1726867681.53259: in VariableManager get_vars() 30575 1726867681.53309: Calling all_inventory to load vars for managed_node3 30575 1726867681.53311: Calling groups_inventory to load vars for managed_node3 30575 1726867681.53313: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867681.53325: Calling all_plugins_play to load vars for managed_node3 30575 1726867681.53327: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867681.53330: Calling groups_plugins_play to load vars for managed_node3 30575 1726867681.54296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867681.55156: done with get_vars() 30575 1726867681.55172: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:28:01 -0400 (0:00:00.591) 0:01:56.929 ****** 30575 1726867681.55238: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867681.55488: worker is 1 (out of 1 available) 30575 1726867681.55503: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867681.55520: done queuing things up, now waiting for results queue to drain 30575 1726867681.55522: waiting for pending results... 
30575 1726867681.55715: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867681.55804: in run() - task 0affcac9-a3a5-e081-a588-0000000024b5 30575 1726867681.55819: variable 'ansible_search_path' from source: unknown 30575 1726867681.55823: variable 'ansible_search_path' from source: unknown 30575 1726867681.55851: calling self._execute() 30575 1726867681.55934: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867681.55942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867681.55982: variable 'omit' from source: magic vars 30575 1726867681.56233: variable 'ansible_distribution_major_version' from source: facts 30575 1726867681.56242: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867681.56327: variable 'network_state' from source: role '' defaults 30575 1726867681.56337: Evaluated conditional (network_state != {}): False 30575 1726867681.56339: when evaluation is False, skipping this task 30575 1726867681.56342: _execute() done 30575 1726867681.56346: dumping result to json 30575 1726867681.56349: done dumping result, returning 30575 1726867681.56357: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-e081-a588-0000000024b5] 30575 1726867681.56362: sending task result for task 0affcac9-a3a5-e081-a588-0000000024b5 30575 1726867681.56450: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024b5 30575 1726867681.56453: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867681.56505: no more pending results, returning what we have 30575 1726867681.56509: results queue empty 30575 1726867681.56509: checking for any_errors_fatal 30575 1726867681.56523: done checking for any_errors_fatal 
30575 1726867681.56524: checking for max_fail_percentage 30575 1726867681.56525: done checking for max_fail_percentage 30575 1726867681.56526: checking to see if all hosts have failed and the running result is not ok 30575 1726867681.56527: done checking to see if all hosts have failed 30575 1726867681.56527: getting the remaining hosts for this loop 30575 1726867681.56529: done getting the remaining hosts for this loop 30575 1726867681.56532: getting the next task for host managed_node3 30575 1726867681.56540: done getting next task for host managed_node3 30575 1726867681.56543: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867681.56547: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867681.56571: getting variables 30575 1726867681.56572: in VariableManager get_vars() 30575 1726867681.56613: Calling all_inventory to load vars for managed_node3 30575 1726867681.56616: Calling groups_inventory to load vars for managed_node3 30575 1726867681.56620: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867681.56630: Calling all_plugins_play to load vars for managed_node3 30575 1726867681.56632: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867681.56635: Calling groups_plugins_play to load vars for managed_node3 30575 1726867681.57394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867681.59014: done with get_vars() 30575 1726867681.59038: done getting variables 30575 1726867681.59093: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:28:01 -0400 (0:00:00.038) 0:01:56.968 ****** 30575 1726867681.59120: entering _queue_task() for managed_node3/debug 30575 1726867681.59353: worker is 1 (out of 1 available) 30575 1726867681.59366: exiting _queue_task() for managed_node3/debug 30575 1726867681.59380: done queuing things up, now waiting for results queue to drain 30575 1726867681.59381: waiting for pending results... 
30575 1726867681.59563: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867681.59674: in run() - task 0affcac9-a3a5-e081-a588-0000000024b6 30575 1726867681.59691: variable 'ansible_search_path' from source: unknown 30575 1726867681.59694: variable 'ansible_search_path' from source: unknown 30575 1726867681.59727: calling self._execute() 30575 1726867681.59806: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867681.59810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867681.59821: variable 'omit' from source: magic vars 30575 1726867681.60099: variable 'ansible_distribution_major_version' from source: facts 30575 1726867681.60108: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867681.60114: variable 'omit' from source: magic vars 30575 1726867681.60163: variable 'omit' from source: magic vars 30575 1726867681.60188: variable 'omit' from source: magic vars 30575 1726867681.60230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867681.60257: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867681.60274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867681.60290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867681.60300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867681.60327: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867681.60330: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867681.60333: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30575 1726867681.60404: Set connection var ansible_pipelining to False 30575 1726867681.60408: Set connection var ansible_shell_type to sh 30575 1726867681.60412: Set connection var ansible_shell_executable to /bin/sh 30575 1726867681.60417: Set connection var ansible_timeout to 10 30575 1726867681.60424: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867681.60431: Set connection var ansible_connection to ssh 30575 1726867681.60449: variable 'ansible_shell_executable' from source: unknown 30575 1726867681.60452: variable 'ansible_connection' from source: unknown 30575 1726867681.60455: variable 'ansible_module_compression' from source: unknown 30575 1726867681.60457: variable 'ansible_shell_type' from source: unknown 30575 1726867681.60459: variable 'ansible_shell_executable' from source: unknown 30575 1726867681.60463: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867681.60467: variable 'ansible_pipelining' from source: unknown 30575 1726867681.60469: variable 'ansible_timeout' from source: unknown 30575 1726867681.60473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867681.60571: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867681.60581: variable 'omit' from source: magic vars 30575 1726867681.60587: starting attempt loop 30575 1726867681.60590: running the handler 30575 1726867681.60686: variable '__network_connections_result' from source: set_fact 30575 1726867681.60732: handler run complete 30575 1726867681.60745: attempt loop complete, returning result 30575 1726867681.60748: _execute() done 30575 1726867681.60751: dumping result to json 30575 1726867681.60753: 
done dumping result, returning 30575 1726867681.60762: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-e081-a588-0000000024b6] 30575 1726867681.60766: sending task result for task 0affcac9-a3a5-e081-a588-0000000024b6 30575 1726867681.60844: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024b6 30575 1726867681.60847: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 30575 1726867681.60922: no more pending results, returning what we have 30575 1726867681.60926: results queue empty 30575 1726867681.60926: checking for any_errors_fatal 30575 1726867681.60934: done checking for any_errors_fatal 30575 1726867681.60935: checking for max_fail_percentage 30575 1726867681.60936: done checking for max_fail_percentage 30575 1726867681.60937: checking to see if all hosts have failed and the running result is not ok 30575 1726867681.60938: done checking to see if all hosts have failed 30575 1726867681.60939: getting the remaining hosts for this loop 30575 1726867681.60940: done getting the remaining hosts for this loop 30575 1726867681.60944: getting the next task for host managed_node3 30575 1726867681.60951: done getting next task for host managed_node3 30575 1726867681.60955: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867681.60959: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867681.60971: getting variables 30575 1726867681.60973: in VariableManager get_vars() 30575 1726867681.61016: Calling all_inventory to load vars for managed_node3 30575 1726867681.61019: Calling groups_inventory to load vars for managed_node3 30575 1726867681.61021: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867681.61031: Calling all_plugins_play to load vars for managed_node3 30575 1726867681.61033: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867681.61036: Calling groups_plugins_play to load vars for managed_node3 30575 1726867681.61786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867681.62644: done with get_vars() 30575 1726867681.62659: done getting variables 30575 1726867681.62701: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:28:01 -0400 (0:00:00.036) 0:01:57.004 ****** 30575 1726867681.62730: entering _queue_task() for managed_node3/debug 30575 1726867681.62935: worker is 1 (out of 1 available) 30575 1726867681.62950: exiting _queue_task() for managed_node3/debug 30575 1726867681.62963: done queuing things up, now waiting for results queue to drain 30575 1726867681.62965: waiting for pending results... 30575 1726867681.63151: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867681.63241: in run() - task 0affcac9-a3a5-e081-a588-0000000024b7 30575 1726867681.63254: variable 'ansible_search_path' from source: unknown 30575 1726867681.63257: variable 'ansible_search_path' from source: unknown 30575 1726867681.63286: calling self._execute() 30575 1726867681.63365: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867681.63368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867681.63380: variable 'omit' from source: magic vars 30575 1726867681.63654: variable 'ansible_distribution_major_version' from source: facts 30575 1726867681.63664: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867681.63670: variable 'omit' from source: magic vars 30575 1726867681.63710: variable 'omit' from source: magic vars 30575 1726867681.63739: variable 'omit' from source: magic vars 30575 1726867681.63771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867681.63798: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867681.63813: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867681.63829: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867681.63840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867681.63864: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867681.63867: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867681.63870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867681.63938: Set connection var ansible_pipelining to False 30575 1726867681.63941: Set connection var ansible_shell_type to sh 30575 1726867681.63944: Set connection var ansible_shell_executable to /bin/sh 30575 1726867681.63951: Set connection var ansible_timeout to 10 30575 1726867681.63954: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867681.64073: Set connection var ansible_connection to ssh 30575 1726867681.64076: variable 'ansible_shell_executable' from source: unknown 30575 1726867681.64082: variable 'ansible_connection' from source: unknown 30575 1726867681.64085: variable 'ansible_module_compression' from source: unknown 30575 1726867681.64087: variable 'ansible_shell_type' from source: unknown 30575 1726867681.64089: variable 'ansible_shell_executable' from source: unknown 30575 1726867681.64091: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867681.64093: variable 'ansible_pipelining' from source: unknown 30575 1726867681.64095: variable 'ansible_timeout' from source: unknown 30575 1726867681.64097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867681.64099: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867681.64102: variable 'omit' from source: magic vars 30575 1726867681.64108: starting attempt loop 30575 1726867681.64110: running the handler 30575 1726867681.64152: variable '__network_connections_result' from source: set_fact 30575 1726867681.64208: variable '__network_connections_result' from source: set_fact 30575 1726867681.64288: handler run complete 30575 1726867681.64305: attempt loop complete, returning result 30575 1726867681.64308: _execute() done 30575 1726867681.64310: dumping result to json 30575 1726867681.64313: done dumping result, returning 30575 1726867681.64324: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-e081-a588-0000000024b7] 30575 1726867681.64327: sending task result for task 0affcac9-a3a5-e081-a588-0000000024b7 30575 1726867681.64411: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024b7 30575 1726867681.64414: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 30575 1726867681.64501: no more pending results, returning what we have 30575 1726867681.64505: results queue empty 30575 1726867681.64505: checking for any_errors_fatal 30575 1726867681.64509: done checking for any_errors_fatal 30575 1726867681.64510: checking for max_fail_percentage 30575 1726867681.64511: done checking for max_fail_percentage 30575 1726867681.64511: checking to see if 
all hosts have failed and the running result is not ok 30575 1726867681.64512: done checking to see if all hosts have failed 30575 1726867681.64513: getting the remaining hosts for this loop 30575 1726867681.64514: done getting the remaining hosts for this loop 30575 1726867681.64517: getting the next task for host managed_node3 30575 1726867681.64524: done getting next task for host managed_node3 30575 1726867681.64527: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867681.64531: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867681.64543: getting variables 30575 1726867681.64544: in VariableManager get_vars() 30575 1726867681.64584: Calling all_inventory to load vars for managed_node3 30575 1726867681.64586: Calling groups_inventory to load vars for managed_node3 30575 1726867681.64588: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867681.64596: Calling all_plugins_play to load vars for managed_node3 30575 1726867681.64598: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867681.64605: Calling groups_plugins_play to load vars for managed_node3 30575 1726867681.65475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867681.66310: done with get_vars() 30575 1726867681.66327: done getting variables 30575 1726867681.66364: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:28:01 -0400 (0:00:00.036) 0:01:57.041 ****** 30575 1726867681.66389: entering _queue_task() for managed_node3/debug 30575 1726867681.66594: worker is 1 (out of 1 available) 30575 1726867681.66609: exiting _queue_task() for managed_node3/debug 30575 1726867681.66621: done queuing things up, now waiting for results queue to drain 30575 1726867681.66623: waiting for pending results... 
30575 1726867681.66804: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867681.66894: in run() - task 0affcac9-a3a5-e081-a588-0000000024b8 30575 1726867681.66905: variable 'ansible_search_path' from source: unknown 30575 1726867681.66908: variable 'ansible_search_path' from source: unknown 30575 1726867681.66939: calling self._execute() 30575 1726867681.67015: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867681.67018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867681.67030: variable 'omit' from source: magic vars 30575 1726867681.67303: variable 'ansible_distribution_major_version' from source: facts 30575 1726867681.67313: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867681.67401: variable 'network_state' from source: role '' defaults 30575 1726867681.67407: Evaluated conditional (network_state != {}): False 30575 1726867681.67410: when evaluation is False, skipping this task 30575 1726867681.67414: _execute() done 30575 1726867681.67416: dumping result to json 30575 1726867681.67423: done dumping result, returning 30575 1726867681.67431: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-e081-a588-0000000024b8] 30575 1726867681.67436: sending task result for task 0affcac9-a3a5-e081-a588-0000000024b8 30575 1726867681.67516: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024b8 30575 1726867681.67520: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30575 1726867681.67565: no more pending results, returning what we have 30575 1726867681.67569: results queue empty 30575 1726867681.67570: checking for any_errors_fatal 30575 1726867681.67580: done checking for any_errors_fatal 30575 1726867681.67581: checking for 
max_fail_percentage 30575 1726867681.67582: done checking for max_fail_percentage 30575 1726867681.67583: checking to see if all hosts have failed and the running result is not ok 30575 1726867681.67584: done checking to see if all hosts have failed 30575 1726867681.67585: getting the remaining hosts for this loop 30575 1726867681.67586: done getting the remaining hosts for this loop 30575 1726867681.67590: getting the next task for host managed_node3 30575 1726867681.67596: done getting next task for host managed_node3 30575 1726867681.67600: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867681.67604: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867681.67625: getting variables 30575 1726867681.67627: in VariableManager get_vars() 30575 1726867681.67663: Calling all_inventory to load vars for managed_node3 30575 1726867681.67665: Calling groups_inventory to load vars for managed_node3 30575 1726867681.67667: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867681.67674: Calling all_plugins_play to load vars for managed_node3 30575 1726867681.67681: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867681.67685: Calling groups_plugins_play to load vars for managed_node3 30575 1726867681.68415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867681.69269: done with get_vars() 30575 1726867681.69286: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:28:01 -0400 (0:00:00.029) 0:01:57.071 ****** 30575 1726867681.69348: entering _queue_task() for managed_node3/ping 30575 1726867681.69543: worker is 1 (out of 1 available) 30575 1726867681.69556: exiting _queue_task() for managed_node3/ping 30575 1726867681.69569: done queuing things up, now waiting for results queue to drain 30575 1726867681.69571: waiting for pending results... 
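Each `get_vars()` block above calls the variable sources in a fixed sequence (`all_inventory` through `groups_plugins_play`). The ordering matters because later sources layer over earlier ones; a rough sketch of that layering (my assumption about the override semantics, not ansible-core's actual `VariableManager` internals; `merge_vars` is a hypothetical helper):

```python
# Sketch of the source ordering visible in the log's get_vars() calls.
# Assumption: each successive source is merged on top of the previous ones,
# so play-level plugin vars override inventory-level vars.
SOURCE_ORDER = [
    "all_inventory",
    "groups_inventory",
    "all_plugins_inventory",
    "all_plugins_play",
    "groups_plugins_inventory",
    "groups_plugins_play",
]

def merge_vars(loaded_by_source):
    """Layer each source's vars over the previous ones, in SOURCE_ORDER."""
    merged = {}
    for source in SOURCE_ORDER:
        merged.update(loaded_by_source.get(source, {}))
    return merged

merged = merge_vars({
    "all_inventory": {"ansible_host": "10.31.15.68", "interface": "lo"},
    "groups_plugins_play": {"interface": "eth0"},  # later source wins
})
```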
30575 1726867681.69755: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867681.69834: in run() - task 0affcac9-a3a5-e081-a588-0000000024b9 30575 1726867681.69846: variable 'ansible_search_path' from source: unknown 30575 1726867681.69849: variable 'ansible_search_path' from source: unknown 30575 1726867681.69876: calling self._execute() 30575 1726867681.69956: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867681.69960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867681.69969: variable 'omit' from source: magic vars 30575 1726867681.70247: variable 'ansible_distribution_major_version' from source: facts 30575 1726867681.70256: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867681.70262: variable 'omit' from source: magic vars 30575 1726867681.70310: variable 'omit' from source: magic vars 30575 1726867681.70339: variable 'omit' from source: magic vars 30575 1726867681.70366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867681.70393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867681.70409: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867681.70425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867681.70436: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867681.70461: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867681.70465: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867681.70468: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867681.70538: Set connection var ansible_pipelining to False 30575 1726867681.70541: Set connection var ansible_shell_type to sh 30575 1726867681.70543: Set connection var ansible_shell_executable to /bin/sh 30575 1726867681.70554: Set connection var ansible_timeout to 10 30575 1726867681.70557: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867681.70561: Set connection var ansible_connection to ssh 30575 1726867681.70580: variable 'ansible_shell_executable' from source: unknown 30575 1726867681.70583: variable 'ansible_connection' from source: unknown 30575 1726867681.70586: variable 'ansible_module_compression' from source: unknown 30575 1726867681.70589: variable 'ansible_shell_type' from source: unknown 30575 1726867681.70591: variable 'ansible_shell_executable' from source: unknown 30575 1726867681.70593: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867681.70596: variable 'ansible_pipelining' from source: unknown 30575 1726867681.70598: variable 'ansible_timeout' from source: unknown 30575 1726867681.70603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867681.70747: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867681.70755: variable 'omit' from source: magic vars 30575 1726867681.70760: starting attempt loop 30575 1726867681.70762: running the handler 30575 1726867681.70778: _low_level_execute_command(): starting 30575 1726867681.70785: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867681.71296: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 
1726867681.71299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.71302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867681.71306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.71358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867681.71361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867681.71363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867681.71422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867681.73120: stdout chunk (state=3): >>>/root <<< 30575 1726867681.73216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867681.73250: stderr chunk (state=3): >>><<< 30575 1726867681.73253: stdout chunk (state=3): >>><<< 30575 1726867681.73274: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867681.73288: _low_level_execute_command(): starting 30575 1726867681.73294: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867681.7327442-35689-180810229008974 `" && echo ansible-tmp-1726867681.7327442-35689-180810229008974="` echo /root/.ansible/tmp/ansible-tmp-1726867681.7327442-35689-180810229008974 `" ) && sleep 0' 30575 1726867681.73746: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867681.73749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867681.73752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.73762: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867681.73764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867681.73766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.73810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867681.73817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867681.73820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867681.73862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867681.75763: stdout chunk (state=3): >>>ansible-tmp-1726867681.7327442-35689-180810229008974=/root/.ansible/tmp/ansible-tmp-1726867681.7327442-35689-180810229008974 <<< 30575 1726867681.75867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867681.75896: stderr chunk (state=3): >>><<< 30575 1726867681.75899: stdout chunk (state=3): >>><<< 30575 1726867681.75916: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867681.7327442-35689-180810229008974=/root/.ansible/tmp/ansible-tmp-1726867681.7327442-35689-180810229008974 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867681.75958: variable 'ansible_module_compression' from source: unknown 30575 1726867681.75992: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30575 1726867681.76025: variable 'ansible_facts' from source: unknown 30575 1726867681.76082: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867681.7327442-35689-180810229008974/AnsiballZ_ping.py 30575 1726867681.76183: Sending initial data 30575 1726867681.76186: Sent initial data (153 bytes) 30575 1726867681.76626: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867681.76629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867681.76632: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.76634: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867681.76636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867681.76638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.76682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867681.76687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867681.76736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867681.78338: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867681.78345: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867681.78380: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867681.78425: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp9uqadygd /root/.ansible/tmp/ansible-tmp-1726867681.7327442-35689-180810229008974/AnsiballZ_ping.py <<< 30575 1726867681.78429: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867681.7327442-35689-180810229008974/AnsiballZ_ping.py" <<< 30575 1726867681.78468: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp9uqadygd" to remote "/root/.ansible/tmp/ansible-tmp-1726867681.7327442-35689-180810229008974/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867681.7327442-35689-180810229008974/AnsiballZ_ping.py" <<< 30575 1726867681.78996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867681.79032: stderr chunk (state=3): >>><<< 30575 1726867681.79035: stdout chunk (state=3): >>><<< 30575 1726867681.79071: done transferring module to remote 30575 1726867681.79080: _low_level_execute_command(): starting 30575 1726867681.79085: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867681.7327442-35689-180810229008974/ /root/.ansible/tmp/ansible-tmp-1726867681.7327442-35689-180810229008974/AnsiballZ_ping.py && sleep 0' 30575 1726867681.79504: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867681.79507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867681.79509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867681.79515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.79562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867681.79565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867681.79613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867681.81373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867681.81396: stderr chunk (state=3): >>><<< 30575 1726867681.81399: stdout chunk (state=3): >>><<< 30575 1726867681.81411: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867681.81414: _low_level_execute_command(): starting 30575 1726867681.81421: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867681.7327442-35689-180810229008974/AnsiballZ_ping.py && sleep 0' 30575 1726867681.81826: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867681.81829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.81831: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867681.81833: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867681.81835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.81882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867681.81897: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867681.81940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867681.97033: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30575 1726867681.98372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867681.98403: stderr chunk (state=3): >>><<< 30575 1726867681.98406: stdout chunk (state=3): >>><<< 30575 1726867681.98424: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
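The stdout chunk captured above is the entire contract of the `ping` module: the transferred `AnsiballZ_ping.py` runs `ansible.modules.ping` on the remote host and prints one JSON object. A sketch that mirrors just that observable contract (not the real module source):

```python
import json

# Mirror of the ping result visible in the log's stdout chunk. The real module
# is ansible.modules.ping; this only reproduces its observable output shape:
# echo the 'data' argument (default "pong") back, plus the invocation record.
def ping_module(module_args):
    data = module_args.get("data", "pong")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

# Modules communicate by printing a single JSON document on stdout,
# which the controller parses out of _low_level_execute_command()'s result.
out = json.dumps(ping_module({"data": "pong"}))
```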
30575 1726867681.98451: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867681.7327442-35689-180810229008974/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867681.98458: _low_level_execute_command(): starting 30575 1726867681.98464: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867681.7327442-35689-180810229008974/ > /dev/null 2>&1 && sleep 0' 30575 1726867681.98904: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867681.98908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.98921: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867681.98974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867681.98988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867681.98994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867681.99028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867682.00848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867682.00871: stderr chunk (state=3): >>><<< 30575 1726867682.00874: stdout chunk (state=3): >>><<< 30575 1726867682.00890: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867682.00895: handler run complete 30575 
1726867682.00909: attempt loop complete, returning result 30575 1726867682.00912: _execute() done 30575 1726867682.00914: dumping result to json 30575 1726867682.00916: done dumping result, returning 30575 1726867682.00930: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-e081-a588-0000000024b9] 30575 1726867682.00933: sending task result for task 0affcac9-a3a5-e081-a588-0000000024b9 30575 1726867682.01021: done sending task result for task 0affcac9-a3a5-e081-a588-0000000024b9 30575 1726867682.01023: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30575 1726867682.01096: no more pending results, returning what we have 30575 1726867682.01099: results queue empty 30575 1726867682.01100: checking for any_errors_fatal 30575 1726867682.01109: done checking for any_errors_fatal 30575 1726867682.01110: checking for max_fail_percentage 30575 1726867682.01111: done checking for max_fail_percentage 30575 1726867682.01112: checking to see if all hosts have failed and the running result is not ok 30575 1726867682.01113: done checking to see if all hosts have failed 30575 1726867682.01114: getting the remaining hosts for this loop 30575 1726867682.01115: done getting the remaining hosts for this loop 30575 1726867682.01119: getting the next task for host managed_node3 30575 1726867682.01130: done getting next task for host managed_node3 30575 1726867682.01133: ^ task is: TASK: meta (role_complete) 30575 1726867682.01137: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867682.01150: getting variables 30575 1726867682.01152: in VariableManager get_vars() 30575 1726867682.01208: Calling all_inventory to load vars for managed_node3 30575 1726867682.01211: Calling groups_inventory to load vars for managed_node3 30575 1726867682.01213: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867682.01222: Calling all_plugins_play to load vars for managed_node3 30575 1726867682.01224: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867682.01227: Calling groups_plugins_play to load vars for managed_node3 30575 1726867682.02188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867682.03032: done with get_vars() 30575 1726867682.03048: done getting variables 30575 1726867682.03108: done queuing things up, now waiting for results queue to drain 30575 1726867682.03110: results queue empty 30575 1726867682.03110: checking for any_errors_fatal 30575 1726867682.03112: done checking for any_errors_fatal 30575 1726867682.03112: checking for max_fail_percentage 30575 1726867682.03113: done checking for max_fail_percentage 30575 1726867682.03113: checking to see if all 
hosts have failed and the running result is not ok 30575 1726867682.03114: done checking to see if all hosts have failed 30575 1726867682.03114: getting the remaining hosts for this loop 30575 1726867682.03115: done getting the remaining hosts for this loop 30575 1726867682.03117: getting the next task for host managed_node3 30575 1726867682.03121: done getting next task for host managed_node3 30575 1726867682.03123: ^ task is: TASK: Test 30575 1726867682.03125: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867682.03128: getting variables 30575 1726867682.03129: in VariableManager get_vars() 30575 1726867682.03138: Calling all_inventory to load vars for managed_node3 30575 1726867682.03139: Calling groups_inventory to load vars for managed_node3 30575 1726867682.03140: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867682.03144: Calling all_plugins_play to load vars for managed_node3 30575 1726867682.03145: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867682.03147: Calling groups_plugins_play to load vars for managed_node3 30575 1726867682.03763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867682.04596: done with get_vars() 30575 1726867682.04610: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 17:28:02 -0400 (0:00:00.353) 0:01:57.424 ****** 30575 1726867682.04664: entering _queue_task() for managed_node3/include_tasks 30575 1726867682.04927: worker is 1 (out of 1 available) 30575 1726867682.04941: exiting _queue_task() for managed_node3/include_tasks 30575 1726867682.04955: done queuing things up, now waiting for results queue to drain 30575 1726867682.04956: waiting for pending results... 
30575 1726867682.05153: running TaskExecutor() for managed_node3/TASK: Test 30575 1726867682.05240: in run() - task 0affcac9-a3a5-e081-a588-0000000020b1 30575 1726867682.05252: variable 'ansible_search_path' from source: unknown 30575 1726867682.05257: variable 'ansible_search_path' from source: unknown 30575 1726867682.05297: variable 'lsr_test' from source: include params 30575 1726867682.05466: variable 'lsr_test' from source: include params 30575 1726867682.05529: variable 'omit' from source: magic vars 30575 1726867682.05629: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867682.05636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867682.05645: variable 'omit' from source: magic vars 30575 1726867682.05813: variable 'ansible_distribution_major_version' from source: facts 30575 1726867682.05823: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867682.05829: variable 'item' from source: unknown 30575 1726867682.05876: variable 'item' from source: unknown 30575 1726867682.05900: variable 'item' from source: unknown 30575 1726867682.05949: variable 'item' from source: unknown 30575 1726867682.06079: dumping result to json 30575 1726867682.06082: done dumping result, returning 30575 1726867682.06084: done running TaskExecutor() for managed_node3/TASK: Test [0affcac9-a3a5-e081-a588-0000000020b1] 30575 1726867682.06086: sending task result for task 0affcac9-a3a5-e081-a588-0000000020b1 30575 1726867682.06121: done sending task result for task 0affcac9-a3a5-e081-a588-0000000020b1 30575 1726867682.06123: WORKER PROCESS EXITING 30575 1726867682.06143: no more pending results, returning what we have 30575 1726867682.06149: in VariableManager get_vars() 30575 1726867682.06204: Calling all_inventory to load vars for managed_node3 30575 1726867682.06207: Calling groups_inventory to load vars for managed_node3 30575 1726867682.06211: Calling all_plugins_inventory to load 
vars for managed_node3 30575 1726867682.06222: Calling all_plugins_play to load vars for managed_node3 30575 1726867682.06225: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867682.06227: Calling groups_plugins_play to load vars for managed_node3 30575 1726867682.07102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867682.07939: done with get_vars() 30575 1726867682.07953: variable 'ansible_search_path' from source: unknown 30575 1726867682.07954: variable 'ansible_search_path' from source: unknown 30575 1726867682.07983: we have included files to process 30575 1726867682.07984: generating all_blocks data 30575 1726867682.07985: done generating all_blocks data 30575 1726867682.07989: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30575 1726867682.07990: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30575 1726867682.07991: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30575 1726867682.08074: done processing included file 30575 1726867682.08076: iterating over new_blocks loaded from include file 30575 1726867682.08078: in VariableManager get_vars() 30575 1726867682.08090: done with get_vars() 30575 1726867682.08091: filtering new block on tags 30575 1726867682.08108: done filtering new block on tags 30575 1726867682.08110: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node3 => (item=tasks/remove+down_profile.yml) 30575 1726867682.08113: extending task lists for all hosts with included blocks 30575 1726867682.08638: done extending task 
lists 30575 1726867682.08639: done processing included files 30575 1726867682.08640: results queue empty 30575 1726867682.08640: checking for any_errors_fatal 30575 1726867682.08641: done checking for any_errors_fatal 30575 1726867682.08642: checking for max_fail_percentage 30575 1726867682.08642: done checking for max_fail_percentage 30575 1726867682.08643: checking to see if all hosts have failed and the running result is not ok 30575 1726867682.08644: done checking to see if all hosts have failed 30575 1726867682.08644: getting the remaining hosts for this loop 30575 1726867682.08645: done getting the remaining hosts for this loop 30575 1726867682.08646: getting the next task for host managed_node3 30575 1726867682.08649: done getting next task for host managed_node3 30575 1726867682.08650: ^ task is: TASK: Include network role 30575 1726867682.08653: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867682.08654: getting variables 30575 1726867682.08655: in VariableManager get_vars() 30575 1726867682.08663: Calling all_inventory to load vars for managed_node3 30575 1726867682.08665: Calling groups_inventory to load vars for managed_node3 30575 1726867682.08666: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867682.08671: Calling all_plugins_play to load vars for managed_node3 30575 1726867682.08673: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867682.08675: Calling groups_plugins_play to load vars for managed_node3 30575 1726867682.09359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867682.14089: done with get_vars() 30575 1726867682.14105: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Friday 20 September 2024 17:28:02 -0400 (0:00:00.094) 0:01:57.519 ****** 30575 1726867682.14157: entering _queue_task() for managed_node3/include_role 30575 1726867682.14439: worker is 1 (out of 1 available) 30575 1726867682.14452: exiting _queue_task() for managed_node3/include_role 30575 1726867682.14468: done queuing things up, now waiting for results queue to drain 30575 1726867682.14470: waiting for pending results... 
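The nested `HOST STATE` dumps above are the hardest part of this log to read: each `(HOST STATE: ...)` level is a cursor into one level of nested task blocks, carrying `block`/`task` indices plus optional child states for the `tasks`, `rescue`, and `always` sections. A minimal illustrative model of that shape (an assumption for readability, not Ansible's actual `PlayIterator` code):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HostState:
    """One level of a host's position: which block/task index it is on."""
    block: int
    task: int
    rescue: int = 0
    always: int = 0
    tasks_child: Optional["HostState"] = None  # the nested "tasks child state?"

def depth(state: HostState) -> int:
    """Count how many block levels the cursor has descended into."""
    return 1 + (depth(state.tasks_child) if state.tasks_child else 0)

# The state logged above for "Include network role" nests three levels:
# block=8, task=2 -> (block=0, task=10 -> (block=0, task=1 -> None))
s = HostState(8, 2, tasks_child=HostState(0, 10, tasks_child=HostState(0, 1)))
print(depth(s))  # 3
```

Deeper dumps later in the log (five levels for the role-internal tasks) are the same structure, just descended further through included blocks.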
30575 1726867682.14658: running TaskExecutor() for managed_node3/TASK: Include network role 30575 1726867682.14759: in run() - task 0affcac9-a3a5-e081-a588-000000002612 30575 1726867682.14771: variable 'ansible_search_path' from source: unknown 30575 1726867682.14776: variable 'ansible_search_path' from source: unknown 30575 1726867682.14809: calling self._execute() 30575 1726867682.14889: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867682.14894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867682.14904: variable 'omit' from source: magic vars 30575 1726867682.15194: variable 'ansible_distribution_major_version' from source: facts 30575 1726867682.15203: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867682.15209: _execute() done 30575 1726867682.15214: dumping result to json 30575 1726867682.15217: done dumping result, returning 30575 1726867682.15227: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcac9-a3a5-e081-a588-000000002612] 30575 1726867682.15234: sending task result for task 0affcac9-a3a5-e081-a588-000000002612 30575 1726867682.15337: done sending task result for task 0affcac9-a3a5-e081-a588-000000002612 30575 1726867682.15340: WORKER PROCESS EXITING 30575 1726867682.15373: no more pending results, returning what we have 30575 1726867682.15381: in VariableManager get_vars() 30575 1726867682.15430: Calling all_inventory to load vars for managed_node3 30575 1726867682.15432: Calling groups_inventory to load vars for managed_node3 30575 1726867682.15436: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867682.15447: Calling all_plugins_play to load vars for managed_node3 30575 1726867682.15450: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867682.15454: Calling groups_plugins_play to load vars for managed_node3 30575 1726867682.16230: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867682.17081: done with get_vars() 30575 1726867682.17096: variable 'ansible_search_path' from source: unknown 30575 1726867682.17097: variable 'ansible_search_path' from source: unknown 30575 1726867682.17181: variable 'omit' from source: magic vars 30575 1726867682.17211: variable 'omit' from source: magic vars 30575 1726867682.17221: variable 'omit' from source: magic vars 30575 1726867682.17224: we have included files to process 30575 1726867682.17224: generating all_blocks data 30575 1726867682.17225: done generating all_blocks data 30575 1726867682.17227: processing included file: fedora.linux_system_roles.network 30575 1726867682.17239: in VariableManager get_vars() 30575 1726867682.17248: done with get_vars() 30575 1726867682.17267: in VariableManager get_vars() 30575 1726867682.17280: done with get_vars() 30575 1726867682.17308: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30575 1726867682.17381: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30575 1726867682.17430: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30575 1726867682.17691: in VariableManager get_vars() 30575 1726867682.17704: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867682.18885: iterating over new_blocks loaded from include file 30575 1726867682.18886: in VariableManager get_vars() 30575 1726867682.18898: done with get_vars() 30575 1726867682.18899: filtering new block on tags 30575 1726867682.19058: done filtering new block on tags 30575 1726867682.19061: in VariableManager get_vars() 30575 1726867682.19071: done with get_vars() 30575 1726867682.19072: filtering new block on tags 30575 1726867682.19084: done 
filtering new block on tags 30575 1726867682.19086: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30575 1726867682.19090: extending task lists for all hosts with included blocks 30575 1726867682.19153: done extending task lists 30575 1726867682.19154: done processing included files 30575 1726867682.19154: results queue empty 30575 1726867682.19155: checking for any_errors_fatal 30575 1726867682.19158: done checking for any_errors_fatal 30575 1726867682.19159: checking for max_fail_percentage 30575 1726867682.19160: done checking for max_fail_percentage 30575 1726867682.19160: checking to see if all hosts have failed and the running result is not ok 30575 1726867682.19161: done checking to see if all hosts have failed 30575 1726867682.19161: getting the remaining hosts for this loop 30575 1726867682.19162: done getting the remaining hosts for this loop 30575 1726867682.19164: getting the next task for host managed_node3 30575 1726867682.19167: done getting next task for host managed_node3 30575 1726867682.19168: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867682.19170: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867682.19179: getting variables 30575 1726867682.19180: in VariableManager get_vars() 30575 1726867682.19189: Calling all_inventory to load vars for managed_node3 30575 1726867682.19190: Calling groups_inventory to load vars for managed_node3 30575 1726867682.19191: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867682.19195: Calling all_plugins_play to load vars for managed_node3 30575 1726867682.19196: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867682.19198: Calling groups_plugins_play to load vars for managed_node3 30575 1726867682.19863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867682.20699: done with get_vars() 30575 1726867682.20714: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:28:02 -0400 (0:00:00.066) 0:01:57.585 ****** 30575 1726867682.20763: entering _queue_task() for managed_node3/include_tasks 30575 1726867682.21032: worker is 1 (out of 1 available) 30575 1726867682.21046: exiting _queue_task() for managed_node3/include_tasks 30575 1726867682.21059: done queuing things up, now waiting for results queue to drain 30575 1726867682.21060: waiting for pending results... 
30575 1726867682.21258: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30575 1726867682.21352: in run() - task 0affcac9-a3a5-e081-a588-000000002694 30575 1726867682.21364: variable 'ansible_search_path' from source: unknown 30575 1726867682.21367: variable 'ansible_search_path' from source: unknown 30575 1726867682.21401: calling self._execute() 30575 1726867682.21483: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867682.21488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867682.21500: variable 'omit' from source: magic vars 30575 1726867682.21791: variable 'ansible_distribution_major_version' from source: facts 30575 1726867682.21799: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867682.21805: _execute() done 30575 1726867682.21809: dumping result to json 30575 1726867682.21811: done dumping result, returning 30575 1726867682.21819: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-e081-a588-000000002694] 30575 1726867682.21828: sending task result for task 0affcac9-a3a5-e081-a588-000000002694 30575 1726867682.21909: done sending task result for task 0affcac9-a3a5-e081-a588-000000002694 30575 1726867682.21911: WORKER PROCESS EXITING 30575 1726867682.21986: no more pending results, returning what we have 30575 1726867682.21991: in VariableManager get_vars() 30575 1726867682.22046: Calling all_inventory to load vars for managed_node3 30575 1726867682.22049: Calling groups_inventory to load vars for managed_node3 30575 1726867682.22051: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867682.22061: Calling all_plugins_play to load vars for managed_node3 30575 1726867682.22064: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867682.22066: Calling 
groups_plugins_play to load vars for managed_node3 30575 1726867682.22926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867682.23787: done with get_vars() 30575 1726867682.23801: variable 'ansible_search_path' from source: unknown 30575 1726867682.23802: variable 'ansible_search_path' from source: unknown 30575 1726867682.23828: we have included files to process 30575 1726867682.23829: generating all_blocks data 30575 1726867682.23830: done generating all_blocks data 30575 1726867682.23832: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867682.23833: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867682.23834: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30575 1726867682.24201: done processing included file 30575 1726867682.24203: iterating over new_blocks loaded from include file 30575 1726867682.24204: in VariableManager get_vars() 30575 1726867682.24220: done with get_vars() 30575 1726867682.24222: filtering new block on tags 30575 1726867682.24242: done filtering new block on tags 30575 1726867682.24244: in VariableManager get_vars() 30575 1726867682.24259: done with get_vars() 30575 1726867682.24260: filtering new block on tags 30575 1726867682.24288: done filtering new block on tags 30575 1726867682.24289: in VariableManager get_vars() 30575 1726867682.24305: done with get_vars() 30575 1726867682.24306: filtering new block on tags 30575 1726867682.24329: done filtering new block on tags 30575 1726867682.24330: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30575 1726867682.24334: extending task lists for 
all hosts with included blocks 30575 1726867682.25281: done extending task lists 30575 1726867682.25282: done processing included files 30575 1726867682.25283: results queue empty 30575 1726867682.25283: checking for any_errors_fatal 30575 1726867682.25285: done checking for any_errors_fatal 30575 1726867682.25286: checking for max_fail_percentage 30575 1726867682.25286: done checking for max_fail_percentage 30575 1726867682.25287: checking to see if all hosts have failed and the running result is not ok 30575 1726867682.25287: done checking to see if all hosts have failed 30575 1726867682.25288: getting the remaining hosts for this loop 30575 1726867682.25289: done getting the remaining hosts for this loop 30575 1726867682.25290: getting the next task for host managed_node3 30575 1726867682.25294: done getting next task for host managed_node3 30575 1726867682.25296: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867682.25298: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867682.25307: getting variables 30575 1726867682.25308: in VariableManager get_vars() 30575 1726867682.25318: Calling all_inventory to load vars for managed_node3 30575 1726867682.25320: Calling groups_inventory to load vars for managed_node3 30575 1726867682.25321: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867682.25326: Calling all_plugins_play to load vars for managed_node3 30575 1726867682.25327: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867682.25329: Calling groups_plugins_play to load vars for managed_node3 30575 1726867682.25954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867682.26807: done with get_vars() 30575 1726867682.26825: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:28:02 -0400 (0:00:00.061) 0:01:57.646 ****** 30575 1726867682.26883: entering _queue_task() for managed_node3/setup 30575 1726867682.27156: worker is 1 (out of 1 available) 30575 1726867682.27170: exiting _queue_task() for managed_node3/setup 30575 1726867682.27185: done queuing things up, now waiting for results queue to drain 30575 1726867682.27187: waiting for pending results... 
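Each `filtering new block on tags` / `done filtering new block on tags` pair above prunes a newly included block against any tags requested on the command line before the block joins the task list. A simplified sketch of that decision (ignoring special tags such as `always`/`never`; this is an assumption about the general rule, not Ansible's exact implementation):

```python
def keep_block(block_tags, requested_tags):
    # With no --tags given, every block survives filtering; otherwise the
    # block must share at least one tag with the requested set.
    if not requested_tags:
        return True
    return bool(set(block_tags) & set(requested_tags))

print(keep_block(["network"], []))         # True: no tags requested
print(keep_block(["setup"], ["network"]))  # False: no overlap
```

In this run no `--tags` were passed, which is why every block logs `done filtering new block on tags` and is kept.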
30575 1726867682.27387: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30575 1726867682.27490: in run() - task 0affcac9-a3a5-e081-a588-0000000026eb 30575 1726867682.27504: variable 'ansible_search_path' from source: unknown 30575 1726867682.27508: variable 'ansible_search_path' from source: unknown 30575 1726867682.27540: calling self._execute() 30575 1726867682.27617: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867682.27625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867682.27636: variable 'omit' from source: magic vars 30575 1726867682.27918: variable 'ansible_distribution_major_version' from source: facts 30575 1726867682.27930: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867682.28081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867682.29747: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867682.29792: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867682.29825: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867682.29850: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867682.29870: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867682.29935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867682.29956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867682.29974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867682.30001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867682.30014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867682.30054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867682.30071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867682.30089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867682.30114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867682.30131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867682.30237: variable '__network_required_facts' from source: role 
'' defaults 30575 1726867682.30250: variable 'ansible_facts' from source: unknown 30575 1726867682.30688: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30575 1726867682.30692: when evaluation is False, skipping this task 30575 1726867682.30695: _execute() done 30575 1726867682.30697: dumping result to json 30575 1726867682.30699: done dumping result, returning 30575 1726867682.30707: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-e081-a588-0000000026eb] 30575 1726867682.30712: sending task result for task 0affcac9-a3a5-e081-a588-0000000026eb 30575 1726867682.30798: done sending task result for task 0affcac9-a3a5-e081-a588-0000000026eb 30575 1726867682.30801: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867682.30849: no more pending results, returning what we have 30575 1726867682.30853: results queue empty 30575 1726867682.30854: checking for any_errors_fatal 30575 1726867682.30855: done checking for any_errors_fatal 30575 1726867682.30856: checking for max_fail_percentage 30575 1726867682.30858: done checking for max_fail_percentage 30575 1726867682.30859: checking to see if all hosts have failed and the running result is not ok 30575 1726867682.30860: done checking to see if all hosts have failed 30575 1726867682.30860: getting the remaining hosts for this loop 30575 1726867682.30862: done getting the remaining hosts for this loop 30575 1726867682.30866: getting the next task for host managed_node3 30575 1726867682.30880: done getting next task for host managed_node3 30575 1726867682.30884: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867682.30888: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867682.30920: getting variables 30575 1726867682.30922: in VariableManager get_vars() 30575 1726867682.30972: Calling all_inventory to load vars for managed_node3 30575 1726867682.30974: Calling groups_inventory to load vars for managed_node3 30575 1726867682.30976: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867682.30990: Calling all_plugins_play to load vars for managed_node3 30575 1726867682.30993: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867682.31001: Calling groups_plugins_play to load vars for managed_node3 30575 1726867682.31926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867682.32800: done with get_vars() 30575 1726867682.32819: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:28:02 -0400 (0:00:00.060) 0:01:57.706 ****** 30575 1726867682.32890: entering _queue_task() for managed_node3/stat 30575 1726867682.33146: worker is 1 (out of 1 available) 30575 1726867682.33159: exiting _queue_task() for managed_node3/stat 30575 1726867682.33173: done queuing things up, now waiting for results queue to drain 30575 1726867682.33175: waiting for pending results... 
30575 1726867682.33359: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30575 1726867682.33456: in run() - task 0affcac9-a3a5-e081-a588-0000000026ed 30575 1726867682.33469: variable 'ansible_search_path' from source: unknown 30575 1726867682.33473: variable 'ansible_search_path' from source: unknown 30575 1726867682.33503: calling self._execute() 30575 1726867682.33583: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867682.33588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867682.33597: variable 'omit' from source: magic vars 30575 1726867682.33871: variable 'ansible_distribution_major_version' from source: facts 30575 1726867682.33881: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867682.33996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867682.34189: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867682.34223: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867682.34247: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867682.34273: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867682.34334: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867682.34352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867682.34369: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867682.34389: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867682.34460: variable '__network_is_ostree' from source: set_fact 30575 1726867682.34465: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867682.34468: when evaluation is False, skipping this task 30575 1726867682.34472: _execute() done 30575 1726867682.34474: dumping result to json 30575 1726867682.34480: done dumping result, returning 30575 1726867682.34487: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-e081-a588-0000000026ed] 30575 1726867682.34492: sending task result for task 0affcac9-a3a5-e081-a588-0000000026ed 30575 1726867682.34574: done sending task result for task 0affcac9-a3a5-e081-a588-0000000026ed 30575 1726867682.34579: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867682.34657: no more pending results, returning what we have 30575 1726867682.34661: results queue empty 30575 1726867682.34661: checking for any_errors_fatal 30575 1726867682.34667: done checking for any_errors_fatal 30575 1726867682.34668: checking for max_fail_percentage 30575 1726867682.34670: done checking for max_fail_percentage 30575 1726867682.34670: checking to see if all hosts have failed and the running result is not ok 30575 1726867682.34672: done checking to see if all hosts have failed 30575 1726867682.34672: getting the remaining hosts for this loop 30575 1726867682.34673: done getting the remaining hosts for this loop 30575 
1726867682.34676: getting the next task for host managed_node3 30575 1726867682.34686: done getting next task for host managed_node3 30575 1726867682.34690: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867682.34695: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867682.34719: getting variables 30575 1726867682.34720: in VariableManager get_vars() 30575 1726867682.34758: Calling all_inventory to load vars for managed_node3 30575 1726867682.34760: Calling groups_inventory to load vars for managed_node3 30575 1726867682.34762: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867682.34770: Calling all_plugins_play to load vars for managed_node3 30575 1726867682.34772: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867682.34775: Calling groups_plugins_play to load vars for managed_node3 30575 1726867682.35535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867682.36533: done with get_vars() 30575 1726867682.36548: done getting variables 30575 1726867682.36590: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:28:02 -0400 (0:00:00.037) 0:01:57.743 ****** 30575 1726867682.36616: entering _queue_task() for managed_node3/set_fact 30575 1726867682.36858: worker is 1 (out of 1 available) 30575 1726867682.36873: exiting _queue_task() for managed_node3/set_fact 30575 1726867682.36889: done queuing things up, now waiting for results queue to drain 30575 1726867682.36891: waiting for pending results... 
30575 1726867682.37075: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30575 1726867682.37175: in run() - task 0affcac9-a3a5-e081-a588-0000000026ee 30575 1726867682.37189: variable 'ansible_search_path' from source: unknown 30575 1726867682.37192: variable 'ansible_search_path' from source: unknown 30575 1726867682.37226: calling self._execute() 30575 1726867682.37296: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867682.37300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867682.37309: variable 'omit' from source: magic vars 30575 1726867682.37591: variable 'ansible_distribution_major_version' from source: facts 30575 1726867682.37600: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867682.37715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867682.37910: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867682.37944: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867682.37969: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867682.37996: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867682.38057: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867682.38075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867682.38098: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867682.38115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867682.38188: variable '__network_is_ostree' from source: set_fact 30575 1726867682.38192: Evaluated conditional (not __network_is_ostree is defined): False 30575 1726867682.38196: when evaluation is False, skipping this task 30575 1726867682.38198: _execute() done 30575 1726867682.38201: dumping result to json 30575 1726867682.38209: done dumping result, returning 30575 1726867682.38212: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-e081-a588-0000000026ee] 30575 1726867682.38220: sending task result for task 0affcac9-a3a5-e081-a588-0000000026ee 30575 1726867682.38296: done sending task result for task 0affcac9-a3a5-e081-a588-0000000026ee 30575 1726867682.38299: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30575 1726867682.38360: no more pending results, returning what we have 30575 1726867682.38364: results queue empty 30575 1726867682.38364: checking for any_errors_fatal 30575 1726867682.38372: done checking for any_errors_fatal 30575 1726867682.38372: checking for max_fail_percentage 30575 1726867682.38374: done checking for max_fail_percentage 30575 1726867682.38374: checking to see if all hosts have failed and the running result is not ok 30575 1726867682.38375: done checking to see if all hosts have failed 30575 1726867682.38376: getting the remaining hosts for this loop 30575 1726867682.38379: done getting the remaining hosts for this loop 
30575 1726867682.38383: getting the next task for host managed_node3 30575 1726867682.38395: done getting next task for host managed_node3 30575 1726867682.38399: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867682.38404: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867682.38430: getting variables 30575 1726867682.38432: in VariableManager get_vars() 30575 1726867682.38473: Calling all_inventory to load vars for managed_node3 30575 1726867682.38475: Calling groups_inventory to load vars for managed_node3 30575 1726867682.38482: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867682.38491: Calling all_plugins_play to load vars for managed_node3 30575 1726867682.38493: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867682.38496: Calling groups_plugins_play to load vars for managed_node3 30575 1726867682.39271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867682.40153: done with get_vars() 30575 1726867682.40168: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:28:02 -0400 (0:00:00.036) 0:01:57.779 ****** 30575 1726867682.40238: entering _queue_task() for managed_node3/service_facts 30575 1726867682.40460: worker is 1 (out of 1 available) 30575 1726867682.40476: exiting _queue_task() for managed_node3/service_facts 30575 1726867682.40490: done queuing things up, now waiting for results queue to drain 30575 1726867682.40492: waiting for pending results... 
30575 1726867682.40671: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30575 1726867682.40755: in run() - task 0affcac9-a3a5-e081-a588-0000000026f0 30575 1726867682.40767: variable 'ansible_search_path' from source: unknown 30575 1726867682.40771: variable 'ansible_search_path' from source: unknown 30575 1726867682.40800: calling self._execute() 30575 1726867682.40879: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867682.40884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867682.40893: variable 'omit' from source: magic vars 30575 1726867682.41166: variable 'ansible_distribution_major_version' from source: facts 30575 1726867682.41175: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867682.41182: variable 'omit' from source: magic vars 30575 1726867682.41231: variable 'omit' from source: magic vars 30575 1726867682.41253: variable 'omit' from source: magic vars 30575 1726867682.41288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867682.41313: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867682.41330: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867682.41342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867682.41352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867682.41379: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867682.41383: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867682.41386: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867682.41452: Set connection var ansible_pipelining to False 30575 1726867682.41455: Set connection var ansible_shell_type to sh 30575 1726867682.41460: Set connection var ansible_shell_executable to /bin/sh 30575 1726867682.41465: Set connection var ansible_timeout to 10 30575 1726867682.41470: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867682.41482: Set connection var ansible_connection to ssh 30575 1726867682.41498: variable 'ansible_shell_executable' from source: unknown 30575 1726867682.41500: variable 'ansible_connection' from source: unknown 30575 1726867682.41503: variable 'ansible_module_compression' from source: unknown 30575 1726867682.41505: variable 'ansible_shell_type' from source: unknown 30575 1726867682.41508: variable 'ansible_shell_executable' from source: unknown 30575 1726867682.41510: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867682.41515: variable 'ansible_pipelining' from source: unknown 30575 1726867682.41520: variable 'ansible_timeout' from source: unknown 30575 1726867682.41522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867682.41658: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867682.41667: variable 'omit' from source: magic vars 30575 1726867682.41672: starting attempt loop 30575 1726867682.41675: running the handler 30575 1726867682.41689: _low_level_execute_command(): starting 30575 1726867682.41696: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867682.42207: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30575 1726867682.42211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867682.42215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867682.42217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867682.42267: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867682.42272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867682.42275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867682.42330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867682.44015: stdout chunk (state=3): >>>/root <<< 30575 1726867682.44120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867682.44147: stderr chunk (state=3): >>><<< 30575 1726867682.44151: stdout chunk (state=3): >>><<< 30575 1726867682.44169: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867682.44182: _low_level_execute_command(): starting 30575 1726867682.44188: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867682.4416769-35703-11393514209511 `" && echo ansible-tmp-1726867682.4416769-35703-11393514209511="` echo /root/.ansible/tmp/ansible-tmp-1726867682.4416769-35703-11393514209511 `" ) && sleep 0' 30575 1726867682.44614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867682.44620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867682.44623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867682.44633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867682.44635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867682.44674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867682.44679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867682.44732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867682.46597: stdout chunk (state=3): >>>ansible-tmp-1726867682.4416769-35703-11393514209511=/root/.ansible/tmp/ansible-tmp-1726867682.4416769-35703-11393514209511 <<< 30575 1726867682.46704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867682.46733: stderr chunk (state=3): >>><<< 30575 1726867682.46736: stdout chunk (state=3): >>><<< 30575 1726867682.46749: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867682.4416769-35703-11393514209511=/root/.ansible/tmp/ansible-tmp-1726867682.4416769-35703-11393514209511 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867682.46792: variable 'ansible_module_compression' from source: unknown 30575 1726867682.46825: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30575 1726867682.46858: variable 'ansible_facts' from source: unknown 30575 1726867682.46920: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867682.4416769-35703-11393514209511/AnsiballZ_service_facts.py 30575 1726867682.47014: Sending initial data 30575 1726867682.47017: Sent initial data (161 bytes) 30575 1726867682.47466: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867682.47470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867682.47472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867682.47474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867682.47476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867682.47528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867682.47535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867682.47580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867682.49114: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867682.49119: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867682.49153: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867682.49204: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp7kttloxd /root/.ansible/tmp/ansible-tmp-1726867682.4416769-35703-11393514209511/AnsiballZ_service_facts.py <<< 30575 1726867682.49206: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867682.4416769-35703-11393514209511/AnsiballZ_service_facts.py" <<< 30575 1726867682.49241: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp7kttloxd" to remote "/root/.ansible/tmp/ansible-tmp-1726867682.4416769-35703-11393514209511/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867682.4416769-35703-11393514209511/AnsiballZ_service_facts.py" <<< 30575 1726867682.49769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867682.49807: stderr chunk (state=3): >>><<< 30575 1726867682.49811: stdout chunk (state=3): >>><<< 30575 1726867682.49869: done transferring module to remote 30575 1726867682.49879: _low_level_execute_command(): starting 30575 1726867682.49883: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867682.4416769-35703-11393514209511/ /root/.ansible/tmp/ansible-tmp-1726867682.4416769-35703-11393514209511/AnsiballZ_service_facts.py && sleep 0' 30575 1726867682.50309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867682.50312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867682.50314: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867682.50316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867682.50322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867682.50367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867682.50370: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867682.50419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867682.52144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867682.52168: stderr chunk (state=3): >>><<< 30575 1726867682.52171: stdout chunk (state=3): >>><<< 30575 1726867682.52185: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867682.52188: _low_level_execute_command(): starting 30575 1726867682.52191: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867682.4416769-35703-11393514209511/AnsiballZ_service_facts.py && sleep 0' 30575 1726867682.52583: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867682.52589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867682.52605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867682.52652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867682.52655: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867682.52713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867684.04243: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30575 1726867684.04262: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": 
"sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 30575 1726867684.04295: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-<<< 30575 1726867684.04314: stdout chunk (state=3): >>>boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state<<< 30575 1726867684.04318: stdout chunk (state=3): >>>": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30575 1726867684.05840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867684.05870: stderr chunk (state=3): >>><<< 30575 1726867684.05873: stdout chunk (state=3): >>><<< 30575 1726867684.05907: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": 
{"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": 
{"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", 
"state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": 
"sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867684.06359: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867682.4416769-35703-11393514209511/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867684.06368: _low_level_execute_command(): starting 30575 1726867684.06371: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867682.4416769-35703-11393514209511/ > /dev/null 2>&1 && sleep 0' 30575 1726867684.06808: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867684.06812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867684.06828: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867684.06883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867684.06886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867684.06898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867684.06939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867684.08772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867684.08797: stderr chunk (state=3): >>><<< 30575 1726867684.08801: stdout chunk (state=3): >>><<< 30575 1726867684.08812: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867684.08817: handler run 
complete 30575 1726867684.08930: variable 'ansible_facts' from source: unknown 30575 1726867684.09027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867684.09559: variable 'ansible_facts' from source: unknown 30575 1726867684.09636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867684.09750: attempt loop complete, returning result 30575 1726867684.09753: _execute() done 30575 1726867684.09756: dumping result to json 30575 1726867684.09793: done dumping result, returning 30575 1726867684.09801: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-e081-a588-0000000026f0] 30575 1726867684.09806: sending task result for task 0affcac9-a3a5-e081-a588-0000000026f0 30575 1726867684.10570: done sending task result for task 0affcac9-a3a5-e081-a588-0000000026f0 30575 1726867684.10573: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867684.10629: no more pending results, returning what we have 30575 1726867684.10631: results queue empty 30575 1726867684.10631: checking for any_errors_fatal 30575 1726867684.10634: done checking for any_errors_fatal 30575 1726867684.10634: checking for max_fail_percentage 30575 1726867684.10635: done checking for max_fail_percentage 30575 1726867684.10636: checking to see if all hosts have failed and the running result is not ok 30575 1726867684.10636: done checking to see if all hosts have failed 30575 1726867684.10637: getting the remaining hosts for this loop 30575 1726867684.10637: done getting the remaining hosts for this loop 30575 1726867684.10640: getting the next task for host managed_node3 30575 1726867684.10644: done getting next task for host managed_node3 30575 
1726867684.10646: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867684.10650: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867684.10658: getting variables 30575 1726867684.10658: in VariableManager get_vars() 30575 1726867684.10684: Calling all_inventory to load vars for managed_node3 30575 1726867684.10686: Calling groups_inventory to load vars for managed_node3 30575 1726867684.10688: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867684.10694: Calling all_plugins_play to load vars for managed_node3 30575 1726867684.10696: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867684.10697: Calling groups_plugins_play to load vars for managed_node3 30575 1726867684.11388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867684.12254: done with get_vars() 30575 1726867684.12269: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:28:04 -0400 (0:00:01.721) 0:01:59.500 ****** 30575 1726867684.12342: entering _queue_task() for managed_node3/package_facts 30575 1726867684.12567: worker is 1 (out of 1 available) 30575 1726867684.12584: exiting _queue_task() for managed_node3/package_facts 30575 1726867684.12597: done queuing things up, now waiting for results queue to drain 30575 1726867684.12599: waiting for pending results... 
30575 1726867684.12788: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30575 1726867684.12889: in run() - task 0affcac9-a3a5-e081-a588-0000000026f1 30575 1726867684.12903: variable 'ansible_search_path' from source: unknown 30575 1726867684.12907: variable 'ansible_search_path' from source: unknown 30575 1726867684.12937: calling self._execute() 30575 1726867684.13012: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867684.13016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867684.13025: variable 'omit' from source: magic vars 30575 1726867684.13297: variable 'ansible_distribution_major_version' from source: facts 30575 1726867684.13306: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867684.13312: variable 'omit' from source: magic vars 30575 1726867684.13364: variable 'omit' from source: magic vars 30575 1726867684.13388: variable 'omit' from source: magic vars 30575 1726867684.13421: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867684.13445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867684.13460: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867684.13472: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867684.13486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867684.13508: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867684.13511: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867684.13514: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30575 1726867684.13580: Set connection var ansible_pipelining to False 30575 1726867684.13584: Set connection var ansible_shell_type to sh 30575 1726867684.13587: Set connection var ansible_shell_executable to /bin/sh 30575 1726867684.13594: Set connection var ansible_timeout to 10 30575 1726867684.13597: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867684.13604: Set connection var ansible_connection to ssh 30575 1726867684.13622: variable 'ansible_shell_executable' from source: unknown 30575 1726867684.13625: variable 'ansible_connection' from source: unknown 30575 1726867684.13628: variable 'ansible_module_compression' from source: unknown 30575 1726867684.13630: variable 'ansible_shell_type' from source: unknown 30575 1726867684.13633: variable 'ansible_shell_executable' from source: unknown 30575 1726867684.13635: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867684.13637: variable 'ansible_pipelining' from source: unknown 30575 1726867684.13642: variable 'ansible_timeout' from source: unknown 30575 1726867684.13644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867684.13783: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867684.13792: variable 'omit' from source: magic vars 30575 1726867684.13797: starting attempt loop 30575 1726867684.13800: running the handler 30575 1726867684.13812: _low_level_execute_command(): starting 30575 1726867684.13822: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867684.14323: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30575 1726867684.14327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867684.14330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867684.14332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867684.14383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867684.14386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867684.14401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867684.14447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867684.16132: stdout chunk (state=3): >>>/root <<< 30575 1726867684.16232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867684.16256: stderr chunk (state=3): >>><<< 30575 1726867684.16260: stdout chunk (state=3): >>><<< 30575 1726867684.16280: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867684.16291: _low_level_execute_command(): starting 30575 1726867684.16296: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867684.162779-35717-138309101962711 `" && echo ansible-tmp-1726867684.162779-35717-138309101962711="` echo /root/.ansible/tmp/ansible-tmp-1726867684.162779-35717-138309101962711 `" ) && sleep 0' 30575 1726867684.16703: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867684.16706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867684.16715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration <<< 30575 1726867684.16721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867684.16724: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867684.16762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867684.16766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867684.16815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867684.18732: stdout chunk (state=3): >>>ansible-tmp-1726867684.162779-35717-138309101962711=/root/.ansible/tmp/ansible-tmp-1726867684.162779-35717-138309101962711 <<< 30575 1726867684.18842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867684.18864: stderr chunk (state=3): >>><<< 30575 1726867684.18868: stdout chunk (state=3): >>><<< 30575 1726867684.18882: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867684.162779-35717-138309101962711=/root/.ansible/tmp/ansible-tmp-1726867684.162779-35717-138309101962711 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867684.18913: variable 'ansible_module_compression' from source: unknown 30575 1726867684.18949: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30575 1726867684.19000: variable 'ansible_facts' from source: unknown 30575 1726867684.19115: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867684.162779-35717-138309101962711/AnsiballZ_package_facts.py 30575 1726867684.19205: Sending initial data 30575 1726867684.19209: Sent initial data (161 bytes) 30575 1726867684.19629: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867684.19633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867684.19635: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867684.19637: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867684.19639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867684.19682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867684.19696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867684.19738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867684.21286: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867684.21293: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867684.21329: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867684.21370: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp_1yppwh6 /root/.ansible/tmp/ansible-tmp-1726867684.162779-35717-138309101962711/AnsiballZ_package_facts.py <<< 30575 1726867684.21379: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867684.162779-35717-138309101962711/AnsiballZ_package_facts.py" <<< 30575 1726867684.21420: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp_1yppwh6" to remote "/root/.ansible/tmp/ansible-tmp-1726867684.162779-35717-138309101962711/AnsiballZ_package_facts.py" <<< 30575 1726867684.21423: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867684.162779-35717-138309101962711/AnsiballZ_package_facts.py" <<< 30575 1726867684.22439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867684.22473: stderr chunk (state=3): >>><<< 30575 1726867684.22479: stdout chunk (state=3): >>><<< 30575 1726867684.22513: done transferring module to remote 30575 1726867684.22524: _low_level_execute_command(): starting 30575 1726867684.22527: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867684.162779-35717-138309101962711/ /root/.ansible/tmp/ansible-tmp-1726867684.162779-35717-138309101962711/AnsiballZ_package_facts.py && sleep 0' 30575 1726867684.22947: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867684.22951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867684.22953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867684.22959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867684.23004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867684.23008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867684.23056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867684.24792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867684.24812: stderr chunk (state=3): >>><<< 30575 1726867684.24815: stdout chunk (state=3): >>><<< 30575 1726867684.24828: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867684.24832: _low_level_execute_command(): starting 30575 1726867684.24834: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867684.162779-35717-138309101962711/AnsiballZ_package_facts.py && sleep 0' 30575 1726867684.25245: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867684.25248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867684.25251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867684.25253: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867684.25255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867684.25304: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867684.25307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867684.25359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867684.69638: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", 
"version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30575 1726867684.69669: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", 
"version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30575 1726867684.69709: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": 
"1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": 
[{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": 
"4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30575 1726867684.69730: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": 
"xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 30575 1726867684.69739: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 30575 1726867684.69742: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", 
"release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 30575 1726867684.69772: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": 
[{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": 
"rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30575 
1726867684.69788: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", 
"version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30575 1726867684.69792: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", 
"release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": 
"perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30575 1726867684.69824: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": 
[{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30575 1726867684.71548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867684.71576: stderr chunk (state=3): >>><<< 30575 1726867684.71581: stdout chunk (state=3): >>><<< 30575 1726867684.71620: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867684.72964: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867684.162779-35717-138309101962711/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867684.72981: _low_level_execute_command(): starting 30575 1726867684.72986: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867684.162779-35717-138309101962711/ > /dev/null 2>&1 && sleep 0' 30575 1726867684.73435: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867684.73438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867684.73440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867684.73442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867684.73444: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867684.73446: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867684.73499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867684.73505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867684.73549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867684.75364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867684.75389: stderr chunk (state=3): >>><<< 30575 1726867684.75392: stdout chunk (state=3): >>><<< 30575 1726867684.75402: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867684.75408: handler run complete 30575 1726867684.75849: variable 'ansible_facts' from source: unknown 30575 1726867684.76120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867684.77214: variable 'ansible_facts' from source: unknown 30575 1726867684.77457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867684.77834: attempt loop complete, returning result 30575 1726867684.77844: _execute() done 30575 1726867684.77847: dumping result to json 30575 1726867684.77962: done dumping result, returning 30575 1726867684.77969: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-e081-a588-0000000026f1] 30575 1726867684.77974: sending task result for task 0affcac9-a3a5-e081-a588-0000000026f1 30575 1726867684.79239: done sending task result for task 0affcac9-a3a5-e081-a588-0000000026f1 30575 1726867684.79242: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867684.79333: no more pending results, returning what we have 30575 1726867684.79336: results queue empty 30575 1726867684.79336: checking for any_errors_fatal 30575 1726867684.79339: done checking for any_errors_fatal 30575 1726867684.79340: checking for max_fail_percentage 30575 1726867684.79341: done checking for max_fail_percentage 30575 1726867684.79341: checking to see if all hosts have failed and the running result is not ok 30575 1726867684.79342: done checking to see if all hosts have failed 30575 1726867684.79342: getting the remaining hosts for this loop 30575 1726867684.79343: done getting the remaining hosts for this loop 30575 1726867684.79346: getting 
the next task for host managed_node3 30575 1726867684.79351: done getting next task for host managed_node3 30575 1726867684.79353: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867684.79358: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867684.79366: getting variables 30575 1726867684.79367: in VariableManager get_vars() 30575 1726867684.79396: Calling all_inventory to load vars for managed_node3 30575 1726867684.79398: Calling groups_inventory to load vars for managed_node3 30575 1726867684.79399: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867684.79406: Calling all_plugins_play to load vars for managed_node3 30575 1726867684.79407: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867684.79409: Calling groups_plugins_play to load vars for managed_node3 30575 1726867684.80105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867684.80961: done with get_vars() 30575 1726867684.80983: done getting variables 30575 1726867684.81024: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:28:04 -0400 (0:00:00.687) 0:02:00.188 ****** 30575 1726867684.81048: entering _queue_task() for managed_node3/debug 30575 1726867684.81265: worker is 1 (out of 1 available) 30575 1726867684.81282: exiting _queue_task() for managed_node3/debug 30575 1726867684.81294: done queuing things up, now waiting for results queue to drain 30575 1726867684.81295: waiting for pending results... 
30575 1726867684.81487: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30575 1726867684.81580: in run() - task 0affcac9-a3a5-e081-a588-000000002695 30575 1726867684.81593: variable 'ansible_search_path' from source: unknown 30575 1726867684.81597: variable 'ansible_search_path' from source: unknown 30575 1726867684.81632: calling self._execute() 30575 1726867684.81706: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867684.81710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867684.81718: variable 'omit' from source: magic vars 30575 1726867684.82014: variable 'ansible_distribution_major_version' from source: facts 30575 1726867684.82026: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867684.82032: variable 'omit' from source: magic vars 30575 1726867684.82074: variable 'omit' from source: magic vars 30575 1726867684.82148: variable 'network_provider' from source: set_fact 30575 1726867684.82161: variable 'omit' from source: magic vars 30575 1726867684.82196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867684.82224: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867684.82239: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867684.82252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867684.82263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867684.82291: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867684.82295: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 
1726867684.82297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867684.82364: Set connection var ansible_pipelining to False 30575 1726867684.82367: Set connection var ansible_shell_type to sh 30575 1726867684.82370: Set connection var ansible_shell_executable to /bin/sh 30575 1726867684.82376: Set connection var ansible_timeout to 10 30575 1726867684.82386: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867684.82393: Set connection var ansible_connection to ssh 30575 1726867684.82410: variable 'ansible_shell_executable' from source: unknown 30575 1726867684.82414: variable 'ansible_connection' from source: unknown 30575 1726867684.82416: variable 'ansible_module_compression' from source: unknown 30575 1726867684.82418: variable 'ansible_shell_type' from source: unknown 30575 1726867684.82423: variable 'ansible_shell_executable' from source: unknown 30575 1726867684.82425: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867684.82429: variable 'ansible_pipelining' from source: unknown 30575 1726867684.82432: variable 'ansible_timeout' from source: unknown 30575 1726867684.82436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867684.82546: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867684.82554: variable 'omit' from source: magic vars 30575 1726867684.82559: starting attempt loop 30575 1726867684.82562: running the handler 30575 1726867684.82599: handler run complete 30575 1726867684.82611: attempt loop complete, returning result 30575 1726867684.82614: _execute() done 30575 1726867684.82617: dumping result to json 30575 1726867684.82622: done dumping result, returning 
30575 1726867684.82630: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-e081-a588-000000002695] 30575 1726867684.82635: sending task result for task 0affcac9-a3a5-e081-a588-000000002695 30575 1726867684.82709: done sending task result for task 0affcac9-a3a5-e081-a588-000000002695 30575 1726867684.82712: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30575 1726867684.82786: no more pending results, returning what we have 30575 1726867684.82789: results queue empty 30575 1726867684.82790: checking for any_errors_fatal 30575 1726867684.82799: done checking for any_errors_fatal 30575 1726867684.82800: checking for max_fail_percentage 30575 1726867684.82801: done checking for max_fail_percentage 30575 1726867684.82802: checking to see if all hosts have failed and the running result is not ok 30575 1726867684.82803: done checking to see if all hosts have failed 30575 1726867684.82804: getting the remaining hosts for this loop 30575 1726867684.82805: done getting the remaining hosts for this loop 30575 1726867684.82808: getting the next task for host managed_node3 30575 1726867684.82815: done getting next task for host managed_node3 30575 1726867684.82819: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867684.82823: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867684.82834: getting variables 30575 1726867684.82836: in VariableManager get_vars() 30575 1726867684.82874: Calling all_inventory to load vars for managed_node3 30575 1726867684.82876: Calling groups_inventory to load vars for managed_node3 30575 1726867684.82883: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867684.82891: Calling all_plugins_play to load vars for managed_node3 30575 1726867684.82893: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867684.82895: Calling groups_plugins_play to load vars for managed_node3 30575 1726867684.83621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867684.84582: done with get_vars() 30575 1726867684.84597: done getting variables 30575 1726867684.84639: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:28:04 -0400 (0:00:00.036) 0:02:00.224 ****** 30575 1726867684.84667: entering _queue_task() for managed_node3/fail 30575 1726867684.84867: worker is 1 (out of 1 available) 30575 1726867684.84883: exiting _queue_task() for managed_node3/fail 30575 1726867684.84896: done queuing things up, now waiting for results queue to drain 30575 1726867684.84898: waiting for pending results... 30575 1726867684.85082: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30575 1726867684.85170: in run() - task 0affcac9-a3a5-e081-a588-000000002696 30575 1726867684.85186: variable 'ansible_search_path' from source: unknown 30575 1726867684.85191: variable 'ansible_search_path' from source: unknown 30575 1726867684.85221: calling self._execute() 30575 1726867684.85298: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867684.85302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867684.85310: variable 'omit' from source: magic vars 30575 1726867684.85594: variable 'ansible_distribution_major_version' from source: facts 30575 1726867684.85602: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867684.85688: variable 'network_state' from source: role '' defaults 30575 1726867684.85698: Evaluated conditional (network_state != {}): False 30575 1726867684.85701: when evaluation is False, skipping this task 30575 1726867684.85704: _execute() done 30575 1726867684.85707: dumping result to json 30575 1726867684.85709: done dumping result, returning 30575 1726867684.85716: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-e081-a588-000000002696] 30575 1726867684.85722: sending task result for task 0affcac9-a3a5-e081-a588-000000002696 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867684.85854: no more pending results, returning what we have 30575 1726867684.85857: results queue empty 30575 1726867684.85858: checking for any_errors_fatal 30575 1726867684.85865: done checking for any_errors_fatal 30575 1726867684.85866: checking for max_fail_percentage 30575 1726867684.85867: done checking for max_fail_percentage 30575 1726867684.85868: checking to see if all hosts have failed and the running result is not ok 30575 1726867684.85869: done checking to see if all hosts have failed 30575 1726867684.85870: getting the remaining hosts for this loop 30575 1726867684.85871: done getting the remaining hosts for this loop 30575 1726867684.85874: getting the next task for host managed_node3 30575 1726867684.85884: done getting next task for host managed_node3 30575 1726867684.85889: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867684.85893: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867684.85915: getting variables 30575 1726867684.85919: in VariableManager get_vars() 30575 1726867684.85956: Calling all_inventory to load vars for managed_node3 30575 1726867684.85958: Calling groups_inventory to load vars for managed_node3 30575 1726867684.85960: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867684.85968: Calling all_plugins_play to load vars for managed_node3 30575 1726867684.85970: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867684.85972: Calling groups_plugins_play to load vars for managed_node3 30575 1726867684.85986: done sending task result for task 0affcac9-a3a5-e081-a588-000000002696 30575 1726867684.85989: WORKER PROCESS EXITING 30575 1726867684.86716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867684.87581: done with get_vars() 30575 1726867684.87596: done getting variables 30575 1726867684.87638: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:28:04 -0400 (0:00:00.029) 0:02:00.254 ****** 30575 1726867684.87660: entering _queue_task() for managed_node3/fail 30575 1726867684.87854: worker is 1 (out of 1 available) 30575 1726867684.87869: exiting _queue_task() for managed_node3/fail 30575 1726867684.87883: done queuing things up, now waiting for results queue to drain 30575 1726867684.87885: waiting for pending results... 30575 1726867684.88058: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30575 1726867684.88148: in run() - task 0affcac9-a3a5-e081-a588-000000002697 30575 1726867684.88160: variable 'ansible_search_path' from source: unknown 30575 1726867684.88163: variable 'ansible_search_path' from source: unknown 30575 1726867684.88190: calling self._execute() 30575 1726867684.88261: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867684.88264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867684.88274: variable 'omit' from source: magic vars 30575 1726867684.88532: variable 'ansible_distribution_major_version' from source: facts 30575 1726867684.88542: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867684.88626: variable 'network_state' from source: role '' defaults 30575 1726867684.88635: Evaluated conditional (network_state != {}): False 30575 1726867684.88638: when evaluation is False, skipping this task 30575 1726867684.88641: _execute() done 30575 1726867684.88643: dumping result to json 30575 1726867684.88647: done dumping result, returning 30575 1726867684.88659: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [0affcac9-a3a5-e081-a588-000000002697] 30575 1726867684.88662: sending task result for task 0affcac9-a3a5-e081-a588-000000002697 30575 1726867684.88743: done sending task result for task 0affcac9-a3a5-e081-a588-000000002697 30575 1726867684.88746: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867684.88808: no more pending results, returning what we have 30575 1726867684.88811: results queue empty 30575 1726867684.88812: checking for any_errors_fatal 30575 1726867684.88820: done checking for any_errors_fatal 30575 1726867684.88820: checking for max_fail_percentage 30575 1726867684.88822: done checking for max_fail_percentage 30575 1726867684.88823: checking to see if all hosts have failed and the running result is not ok 30575 1726867684.88823: done checking to see if all hosts have failed 30575 1726867684.88824: getting the remaining hosts for this loop 30575 1726867684.88825: done getting the remaining hosts for this loop 30575 1726867684.88828: getting the next task for host managed_node3 30575 1726867684.88836: done getting next task for host managed_node3 30575 1726867684.88839: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867684.88843: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867684.88866: getting variables 30575 1726867684.88868: in VariableManager get_vars() 30575 1726867684.88904: Calling all_inventory to load vars for managed_node3 30575 1726867684.88906: Calling groups_inventory to load vars for managed_node3 30575 1726867684.88907: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867684.88913: Calling all_plugins_play to load vars for managed_node3 30575 1726867684.88915: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867684.88916: Calling groups_plugins_play to load vars for managed_node3 30575 1726867684.89783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867684.90625: done with get_vars() 30575 1726867684.90641: done getting variables 30575 1726867684.90680: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 
September 2024 17:28:04 -0400 (0:00:00.030) 0:02:00.284 ****** 30575 1726867684.90703: entering _queue_task() for managed_node3/fail 30575 1726867684.90901: worker is 1 (out of 1 available) 30575 1726867684.90920: exiting _queue_task() for managed_node3/fail 30575 1726867684.90933: done queuing things up, now waiting for results queue to drain 30575 1726867684.90934: waiting for pending results... 30575 1726867684.91107: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30575 1726867684.91202: in run() - task 0affcac9-a3a5-e081-a588-000000002698 30575 1726867684.91213: variable 'ansible_search_path' from source: unknown 30575 1726867684.91216: variable 'ansible_search_path' from source: unknown 30575 1726867684.91245: calling self._execute() 30575 1726867684.91316: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867684.91322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867684.91330: variable 'omit' from source: magic vars 30575 1726867684.91593: variable 'ansible_distribution_major_version' from source: facts 30575 1726867684.91601: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867684.91724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867684.93232: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867684.93288: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867684.93316: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867684.93346: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 
1726867684.93365: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867684.93426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867684.93449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867684.93466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867684.93494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867684.93505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867684.93578: variable 'ansible_distribution_major_version' from source: facts 30575 1726867684.93592: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30575 1726867684.93665: variable 'ansible_distribution' from source: facts 30575 1726867684.93669: variable '__network_rh_distros' from source: role '' defaults 30575 1726867684.93679: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30575 1726867684.93835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867684.93852: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867684.93868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867684.93898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867684.93909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867684.93944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867684.93960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867684.93976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867684.94004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867684.94014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 
1726867684.94045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867684.94061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867684.94079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867684.94106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867684.94116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867684.94308: variable 'network_connections' from source: include params 30575 1726867684.94316: variable 'interface' from source: play vars 30575 1726867684.94364: variable 'interface' from source: play vars 30575 1726867684.94372: variable 'network_state' from source: role '' defaults 30575 1726867684.94421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867684.94531: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867684.94561: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867684.94584: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867684.94605: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867684.94650: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867684.94664: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867684.94687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867684.94705: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867684.94726: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30575 1726867684.94730: when evaluation is False, skipping this task 30575 1726867684.94732: _execute() done 30575 1726867684.94735: dumping result to json 30575 1726867684.94737: done dumping result, returning 30575 1726867684.94745: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-e081-a588-000000002698] 30575 1726867684.94751: sending task result for task 0affcac9-a3a5-e081-a588-000000002698 30575 1726867684.94832: done sending task result for task 0affcac9-a3a5-e081-a588-000000002698 30575 1726867684.94835: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30575 1726867684.94909: no more pending results, returning what we have 30575 1726867684.94912: results queue empty 30575 1726867684.94913: checking for any_errors_fatal 30575 1726867684.94920: done checking for any_errors_fatal 30575 1726867684.94921: checking for max_fail_percentage 30575 1726867684.94923: done checking for max_fail_percentage 30575 1726867684.94924: checking to see if all hosts have failed and the running result is not ok 30575 1726867684.94925: done checking to see if all hosts have failed 30575 1726867684.94925: getting the remaining hosts for this loop 30575 1726867684.94928: done getting the remaining hosts for this loop 30575 1726867684.94931: getting the next task for host managed_node3 30575 1726867684.94941: done getting next task for host managed_node3 30575 1726867684.94945: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867684.94949: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867684.94979: getting variables 30575 1726867684.94981: in VariableManager get_vars() 30575 1726867684.95023: Calling all_inventory to load vars for managed_node3 30575 1726867684.95026: Calling groups_inventory to load vars for managed_node3 30575 1726867684.95028: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867684.95037: Calling all_plugins_play to load vars for managed_node3 30575 1726867684.95040: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867684.95042: Calling groups_plugins_play to load vars for managed_node3 30575 1726867684.95847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867684.96710: done with get_vars() 30575 1726867684.96729: done getting variables 30575 1726867684.96768: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:28:04 -0400 (0:00:00.060) 0:02:00.345 ****** 30575 1726867684.96794: entering _queue_task() for managed_node3/dnf 30575 1726867684.97025: worker is 1 (out of 1 available) 30575 1726867684.97039: exiting _queue_task() for managed_node3/dnf 30575 1726867684.97052: done queuing things up, now waiting for results queue to drain 30575 1726867684.97053: waiting for pending results... 30575 1726867684.97246: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30575 1726867684.97336: in run() - task 0affcac9-a3a5-e081-a588-000000002699 30575 1726867684.97348: variable 'ansible_search_path' from source: unknown 30575 1726867684.97352: variable 'ansible_search_path' from source: unknown 30575 1726867684.97384: calling self._execute() 30575 1726867684.97460: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867684.97463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867684.97473: variable 'omit' from source: magic vars 30575 1726867684.97754: variable 'ansible_distribution_major_version' from source: facts 30575 1726867684.97763: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867684.97901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867684.99644: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867684.99690: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867684.99722: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 
1726867684.99755: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867684.99774: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867684.99834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867684.99854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867684.99872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867684.99901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867684.99912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867684.99982: variable 'ansible_distribution' from source: facts 30575 1726867684.99985: variable 'ansible_distribution_major_version' from source: facts 30575 1726867685.00002: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30575 1726867685.00070: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867685.00154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30575 1726867685.00170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.00189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.00214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.00225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.00253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.00269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.00287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.00310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.00322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.00349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.00364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.00382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.00406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.00416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.00513: variable 'network_connections' from source: include params 30575 1726867685.00523: variable 'interface' from source: play vars 30575 1726867685.00569: variable 'interface' from source: play vars 30575 1726867685.00616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867685.00723: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867685.00748: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867685.00771: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867685.00794: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867685.00826: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867685.00842: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867685.00863: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.00884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867685.00925: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867685.01067: variable 'network_connections' from source: include params 30575 1726867685.01071: variable 'interface' from source: play vars 30575 1726867685.01116: variable 'interface' from source: play vars 30575 1726867685.01134: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867685.01137: when evaluation is False, skipping this task 30575 1726867685.01140: _execute() done 30575 1726867685.01142: dumping result to json 30575 1726867685.01145: done dumping result, returning 30575 1726867685.01152: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-000000002699] 30575 1726867685.01157: sending task result for task 0affcac9-a3a5-e081-a588-000000002699 30575 1726867685.01240: 
done sending task result for task 0affcac9-a3a5-e081-a588-000000002699 30575 1726867685.01243: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867685.01292: no more pending results, returning what we have 30575 1726867685.01295: results queue empty 30575 1726867685.01295: checking for any_errors_fatal 30575 1726867685.01301: done checking for any_errors_fatal 30575 1726867685.01302: checking for max_fail_percentage 30575 1726867685.01304: done checking for max_fail_percentage 30575 1726867685.01305: checking to see if all hosts have failed and the running result is not ok 30575 1726867685.01306: done checking to see if all hosts have failed 30575 1726867685.01307: getting the remaining hosts for this loop 30575 1726867685.01309: done getting the remaining hosts for this loop 30575 1726867685.01312: getting the next task for host managed_node3 30575 1726867685.01322: done getting next task for host managed_node3 30575 1726867685.01326: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867685.01330: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867685.01358: getting variables 30575 1726867685.01359: in VariableManager get_vars() 30575 1726867685.01408: Calling all_inventory to load vars for managed_node3 30575 1726867685.01410: Calling groups_inventory to load vars for managed_node3 30575 1726867685.01412: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867685.01423: Calling all_plugins_play to load vars for managed_node3 30575 1726867685.01426: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867685.01428: Calling groups_plugins_play to load vars for managed_node3 30575 1726867685.02316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867685.03168: done with get_vars() 30575 1726867685.03185: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30575 1726867685.03237: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:28:05 -0400 (0:00:00.064) 0:02:00.410 ****** 30575 1726867685.03261: entering _queue_task() for managed_node3/yum 30575 1726867685.03473: worker is 1 (out of 1 available) 30575 1726867685.03488: exiting _queue_task() for managed_node3/yum 30575 1726867685.03501: done queuing things up, now waiting for results queue to drain 30575 1726867685.03503: waiting for pending results... 30575 1726867685.03681: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30575 1726867685.03781: in run() - task 0affcac9-a3a5-e081-a588-00000000269a 30575 1726867685.03801: variable 'ansible_search_path' from source: unknown 30575 1726867685.03805: variable 'ansible_search_path' from source: unknown 30575 1726867685.03835: calling self._execute() 30575 1726867685.03911: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867685.03915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867685.03928: variable 'omit' from source: magic vars 30575 1726867685.04211: variable 'ansible_distribution_major_version' from source: facts 30575 1726867685.04221: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867685.04341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867685.05838: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867685.05889: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867685.05918: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867685.05946: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867685.05967: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867685.06029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.06050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.06068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.06095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.06105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.06173: variable 'ansible_distribution_major_version' from source: facts 30575 1726867685.06187: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30575 1726867685.06190: when evaluation is False, skipping this task 30575 1726867685.06193: _execute() done 30575 1726867685.06196: dumping result to json 30575 1726867685.06198: done dumping result, returning 30575 1726867685.06206: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-00000000269a] 30575 1726867685.06211: sending task result for task 0affcac9-a3a5-e081-a588-00000000269a 30575 1726867685.06296: done sending task result for task 0affcac9-a3a5-e081-a588-00000000269a 30575 1726867685.06300: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30575 1726867685.06375: no more pending results, returning what we have 30575 1726867685.06383: results queue empty 30575 1726867685.06384: checking for any_errors_fatal 30575 1726867685.06390: done checking for any_errors_fatal 30575 1726867685.06390: checking for max_fail_percentage 30575 1726867685.06392: done checking for max_fail_percentage 30575 1726867685.06393: checking to see if all hosts have failed and the running result is not ok 30575 1726867685.06394: done checking to see if all hosts have failed 30575 1726867685.06394: getting the remaining hosts for this loop 30575 1726867685.06396: done getting the remaining hosts for this loop 30575 1726867685.06399: getting the next task for host managed_node3 30575 1726867685.06406: done getting next task for host managed_node3 30575 1726867685.06410: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867685.06414: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867685.06438: getting variables 30575 1726867685.06439: in VariableManager get_vars() 30575 1726867685.06479: Calling all_inventory to load vars for managed_node3 30575 1726867685.06482: Calling groups_inventory to load vars for managed_node3 30575 1726867685.06484: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867685.06495: Calling all_plugins_play to load vars for managed_node3 30575 1726867685.06498: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867685.06500: Calling groups_plugins_play to load vars for managed_node3 30575 1726867685.07242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867685.08190: done with get_vars() 30575 1726867685.08205: done getting variables 30575 1726867685.08245: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:28:05 -0400 (0:00:00.050) 0:02:00.460 ****** 30575 1726867685.08268: entering _queue_task() for managed_node3/fail 30575 1726867685.08470: worker is 1 (out of 1 available) 30575 1726867685.08485: exiting _queue_task() for managed_node3/fail 30575 1726867685.08499: done queuing things up, now waiting for results queue to drain 30575 1726867685.08500: waiting for pending results... 30575 1726867685.08681: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30575 1726867685.08779: in run() - task 0affcac9-a3a5-e081-a588-00000000269b 30575 1726867685.08792: variable 'ansible_search_path' from source: unknown 30575 1726867685.08796: variable 'ansible_search_path' from source: unknown 30575 1726867685.08825: calling self._execute() 30575 1726867685.08901: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867685.08906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867685.08914: variable 'omit' from source: magic vars 30575 1726867685.09190: variable 'ansible_distribution_major_version' from source: facts 30575 1726867685.09198: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867685.09281: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867685.09409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867685.15856: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867685.15895: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867685.15922: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867685.15948: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867685.15976: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867685.16028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.16049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.16068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.16095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.16106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.16143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.16158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.16174: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.16200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.16211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.16240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.16258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.16274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.16299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.16310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.16418: variable 'network_connections' from source: include params 30575 1726867685.16428: variable 'interface' from source: play vars 30575 1726867685.16478: variable 'interface' from source: play vars 30575 1726867685.16525: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867685.16626: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867685.16652: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867685.16675: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867685.16696: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867685.16728: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867685.16743: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867685.16759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.16776: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867685.16807: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867685.16962: variable 'network_connections' from source: include params 30575 1726867685.16966: variable 'interface' from source: play vars 30575 1726867685.17010: variable 'interface' from source: play vars 30575 1726867685.17031: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867685.17034: when evaluation is False, skipping this task 30575 
1726867685.17037: _execute() done 30575 1726867685.17039: dumping result to json 30575 1726867685.17042: done dumping result, returning 30575 1726867685.17047: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-00000000269b] 30575 1726867685.17049: sending task result for task 0affcac9-a3a5-e081-a588-00000000269b 30575 1726867685.17137: done sending task result for task 0affcac9-a3a5-e081-a588-00000000269b 30575 1726867685.17140: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867685.17184: no more pending results, returning what we have 30575 1726867685.17187: results queue empty 30575 1726867685.17188: checking for any_errors_fatal 30575 1726867685.17195: done checking for any_errors_fatal 30575 1726867685.17196: checking for max_fail_percentage 30575 1726867685.17197: done checking for max_fail_percentage 30575 1726867685.17198: checking to see if all hosts have failed and the running result is not ok 30575 1726867685.17199: done checking to see if all hosts have failed 30575 1726867685.17199: getting the remaining hosts for this loop 30575 1726867685.17201: done getting the remaining hosts for this loop 30575 1726867685.17204: getting the next task for host managed_node3 30575 1726867685.17211: done getting next task for host managed_node3 30575 1726867685.17214: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30575 1726867685.17220: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867685.17245: getting variables 30575 1726867685.17246: in VariableManager get_vars() 30575 1726867685.17291: Calling all_inventory to load vars for managed_node3 30575 1726867685.17294: Calling groups_inventory to load vars for managed_node3 30575 1726867685.17296: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867685.17304: Calling all_plugins_play to load vars for managed_node3 30575 1726867685.17306: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867685.17309: Calling groups_plugins_play to load vars for managed_node3 30575 1726867685.22535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867685.23381: done with get_vars() 30575 1726867685.23398: done getting variables 30575 1726867685.23434: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:28:05 -0400 (0:00:00.151) 0:02:00.612 ****** 30575 1726867685.23458: entering _queue_task() for managed_node3/package 30575 1726867685.23744: worker is 1 (out of 1 available) 30575 1726867685.23760: exiting _queue_task() for managed_node3/package 30575 1726867685.23774: done queuing things up, now waiting for results queue to drain 30575 1726867685.23778: waiting for pending results... 30575 1726867685.23981: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30575 1726867685.24102: in run() - task 0affcac9-a3a5-e081-a588-00000000269c 30575 1726867685.24116: variable 'ansible_search_path' from source: unknown 30575 1726867685.24124: variable 'ansible_search_path' from source: unknown 30575 1726867685.24153: calling self._execute() 30575 1726867685.24236: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867685.24241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867685.24250: variable 'omit' from source: magic vars 30575 1726867685.24538: variable 'ansible_distribution_major_version' from source: facts 30575 1726867685.24548: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867685.24687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867685.24889: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867685.24926: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867685.24997: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867685.25026: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867685.25124: variable 'network_packages' from source: role '' defaults 30575 1726867685.25193: variable '__network_provider_setup' from source: role '' defaults 30575 1726867685.25202: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867685.25249: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867685.25256: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867685.25301: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867685.25415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867685.26754: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867685.26802: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867685.26828: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867685.26854: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867685.26884: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867685.26943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.26966: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.26985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.27011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.27064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.27068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.27070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.27086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.27111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.27172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 
1726867685.27263: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867685.27335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.27352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.27369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.27401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.27412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.27470: variable 'ansible_python' from source: facts 30575 1726867685.27485: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867685.27541: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867685.27597: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867685.27678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.27695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.27715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.27739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.27750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.27782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.27802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.27826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.27846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.27856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.27953: variable 'network_connections' from source: include params 
30575 1726867685.27959: variable 'interface' from source: play vars 30575 1726867685.28027: variable 'interface' from source: play vars 30575 1726867685.28079: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867685.28098: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867685.28121: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.28140: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867685.28181: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867685.28350: variable 'network_connections' from source: include params 30575 1726867685.28354: variable 'interface' from source: play vars 30575 1726867685.28424: variable 'interface' from source: play vars 30575 1726867685.28447: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867685.28502: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867685.28694: variable 'network_connections' from source: include params 30575 1726867685.28698: variable 'interface' from source: play vars 30575 1726867685.28740: variable 'interface' from source: play vars 30575 1726867685.28756: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867685.28810: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867685.28998: variable 'network_connections' 
from source: include params 30575 1726867685.29001: variable 'interface' from source: play vars 30575 1726867685.29049: variable 'interface' from source: play vars 30575 1726867685.29086: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867685.29131: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867685.29134: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867685.29174: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867685.29306: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867685.29598: variable 'network_connections' from source: include params 30575 1726867685.29602: variable 'interface' from source: play vars 30575 1726867685.29643: variable 'interface' from source: play vars 30575 1726867685.29649: variable 'ansible_distribution' from source: facts 30575 1726867685.29652: variable '__network_rh_distros' from source: role '' defaults 30575 1726867685.29658: variable 'ansible_distribution_major_version' from source: facts 30575 1726867685.29670: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867685.29784: variable 'ansible_distribution' from source: facts 30575 1726867685.29788: variable '__network_rh_distros' from source: role '' defaults 30575 1726867685.29790: variable 'ansible_distribution_major_version' from source: facts 30575 1726867685.29803: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867685.29908: variable 'ansible_distribution' from source: facts 30575 1726867685.29912: variable '__network_rh_distros' from source: role '' defaults 30575 1726867685.29914: variable 'ansible_distribution_major_version' from source: facts 30575 1726867685.29943: variable 'network_provider' from source: set_fact 30575 
1726867685.29953: variable 'ansible_facts' from source: unknown 30575 1726867685.30502: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30575 1726867685.30505: when evaluation is False, skipping this task 30575 1726867685.30508: _execute() done 30575 1726867685.30510: dumping result to json 30575 1726867685.30512: done dumping result, returning 30575 1726867685.30519: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-e081-a588-00000000269c] 30575 1726867685.30527: sending task result for task 0affcac9-a3a5-e081-a588-00000000269c 30575 1726867685.30617: done sending task result for task 0affcac9-a3a5-e081-a588-00000000269c 30575 1726867685.30620: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30575 1726867685.30669: no more pending results, returning what we have 30575 1726867685.30672: results queue empty 30575 1726867685.30673: checking for any_errors_fatal 30575 1726867685.30683: done checking for any_errors_fatal 30575 1726867685.30684: checking for max_fail_percentage 30575 1726867685.30685: done checking for max_fail_percentage 30575 1726867685.30686: checking to see if all hosts have failed and the running result is not ok 30575 1726867685.30687: done checking to see if all hosts have failed 30575 1726867685.30688: getting the remaining hosts for this loop 30575 1726867685.30690: done getting the remaining hosts for this loop 30575 1726867685.30694: getting the next task for host managed_node3 30575 1726867685.30702: done getting next task for host managed_node3 30575 1726867685.30705: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867685.30710: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867685.30743: getting variables 30575 1726867685.30744: in VariableManager get_vars() 30575 1726867685.30803: Calling all_inventory to load vars for managed_node3 30575 1726867685.30806: Calling groups_inventory to load vars for managed_node3 30575 1726867685.30808: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867685.30817: Calling all_plugins_play to load vars for managed_node3 30575 1726867685.30819: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867685.30822: Calling groups_plugins_play to load vars for managed_node3 30575 1726867685.31684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867685.32657: done with get_vars() 30575 1726867685.32672: done getting variables 30575 1726867685.32715: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:28:05 -0400 (0:00:00.092) 0:02:00.704 ****** 30575 1726867685.32741: entering _queue_task() for managed_node3/package 30575 1726867685.32980: worker is 1 (out of 1 available) 30575 1726867685.32995: exiting _queue_task() for managed_node3/package 30575 1726867685.33011: done queuing things up, now waiting for results queue to drain 30575 1726867685.33012: waiting for pending results... 
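[Editor's annotation] The task queued above is skipped a few records later because its conditional `network_state != {}` evaluates False. As a hedged sketch only: the log shows this task uses the `package` action module, lives at roles/network/tasks/main.yml:85 in the collection, and gates on the distribution version and `network_state`; the exact package list below is an assumption for illustration, not the role's actual definition.

```yaml
# Illustrative reconstruction of the skipped task -- package names are assumed.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}
```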
30575 1726867685.33201: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30575 1726867685.33314: in run() - task 0affcac9-a3a5-e081-a588-00000000269d 30575 1726867685.33329: variable 'ansible_search_path' from source: unknown 30575 1726867685.33333: variable 'ansible_search_path' from source: unknown 30575 1726867685.33363: calling self._execute() 30575 1726867685.33441: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867685.33444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867685.33458: variable 'omit' from source: magic vars 30575 1726867685.33731: variable 'ansible_distribution_major_version' from source: facts 30575 1726867685.33741: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867685.33831: variable 'network_state' from source: role '' defaults 30575 1726867685.33839: Evaluated conditional (network_state != {}): False 30575 1726867685.33842: when evaluation is False, skipping this task 30575 1726867685.33845: _execute() done 30575 1726867685.33849: dumping result to json 30575 1726867685.33852: done dumping result, returning 30575 1726867685.33861: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-e081-a588-00000000269d] 30575 1726867685.33865: sending task result for task 0affcac9-a3a5-e081-a588-00000000269d 30575 1726867685.33961: done sending task result for task 0affcac9-a3a5-e081-a588-00000000269d 30575 1726867685.33964: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867685.34036: no more pending results, returning what we have 30575 1726867685.34040: results queue empty 30575 1726867685.34040: checking 
for any_errors_fatal 30575 1726867685.34045: done checking for any_errors_fatal 30575 1726867685.34045: checking for max_fail_percentage 30575 1726867685.34047: done checking for max_fail_percentage 30575 1726867685.34047: checking to see if all hosts have failed and the running result is not ok 30575 1726867685.34048: done checking to see if all hosts have failed 30575 1726867685.34049: getting the remaining hosts for this loop 30575 1726867685.34050: done getting the remaining hosts for this loop 30575 1726867685.34053: getting the next task for host managed_node3 30575 1726867685.34061: done getting next task for host managed_node3 30575 1726867685.34065: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867685.34069: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867685.34094: getting variables 30575 1726867685.34096: in VariableManager get_vars() 30575 1726867685.34134: Calling all_inventory to load vars for managed_node3 30575 1726867685.34137: Calling groups_inventory to load vars for managed_node3 30575 1726867685.34139: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867685.34146: Calling all_plugins_play to load vars for managed_node3 30575 1726867685.34149: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867685.34151: Calling groups_plugins_play to load vars for managed_node3 30575 1726867685.34892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867685.35761: done with get_vars() 30575 1726867685.35778: done getting variables 30575 1726867685.35820: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:28:05 -0400 (0:00:00.031) 0:02:00.735 ****** 30575 1726867685.35844: entering _queue_task() for managed_node3/package 30575 1726867685.36052: worker is 1 (out of 1 available) 30575 1726867685.36068: exiting _queue_task() for managed_node3/package 30575 1726867685.36084: done queuing things up, now waiting for results queue to drain 30575 1726867685.36085: waiting for pending results... 
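[Editor's annotation] Same pattern for the next task: the log records the `package` action module, task path roles/network/tasks/main.yml:96, and the false condition `network_state != {}`. A minimal sketch consistent with those log lines (the task body is reconstructed, not copied from the role):

```yaml
# Illustrative reconstruction -- the real task is defined in the collection
# at roles/network/tasks/main.yml:96.
- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}
```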
30575 1726867685.36271: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30575 1726867685.36385: in run() - task 0affcac9-a3a5-e081-a588-00000000269e 30575 1726867685.36397: variable 'ansible_search_path' from source: unknown 30575 1726867685.36401: variable 'ansible_search_path' from source: unknown 30575 1726867685.36432: calling self._execute() 30575 1726867685.36510: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867685.36513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867685.36526: variable 'omit' from source: magic vars 30575 1726867685.36797: variable 'ansible_distribution_major_version' from source: facts 30575 1726867685.36806: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867685.36895: variable 'network_state' from source: role '' defaults 30575 1726867685.36905: Evaluated conditional (network_state != {}): False 30575 1726867685.36908: when evaluation is False, skipping this task 30575 1726867685.36911: _execute() done 30575 1726867685.36914: dumping result to json 30575 1726867685.36919: done dumping result, returning 30575 1726867685.36926: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-e081-a588-00000000269e] 30575 1726867685.36931: sending task result for task 0affcac9-a3a5-e081-a588-00000000269e 30575 1726867685.37043: done sending task result for task 0affcac9-a3a5-e081-a588-00000000269e 30575 1726867685.37046: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867685.37110: no more pending results, returning what we have 30575 1726867685.37114: results queue empty 30575 1726867685.37115: checking for 
any_errors_fatal 30575 1726867685.37121: done checking for any_errors_fatal 30575 1726867685.37122: checking for max_fail_percentage 30575 1726867685.37123: done checking for max_fail_percentage 30575 1726867685.37124: checking to see if all hosts have failed and the running result is not ok 30575 1726867685.37125: done checking to see if all hosts have failed 30575 1726867685.37126: getting the remaining hosts for this loop 30575 1726867685.37127: done getting the remaining hosts for this loop 30575 1726867685.37130: getting the next task for host managed_node3 30575 1726867685.37137: done getting next task for host managed_node3 30575 1726867685.37141: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867685.37145: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867685.37170: getting variables 30575 1726867685.37171: in VariableManager get_vars() 30575 1726867685.37211: Calling all_inventory to load vars for managed_node3 30575 1726867685.37213: Calling groups_inventory to load vars for managed_node3 30575 1726867685.37215: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867685.37225: Calling all_plugins_play to load vars for managed_node3 30575 1726867685.37228: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867685.37230: Calling groups_plugins_play to load vars for managed_node3 30575 1726867685.38145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867685.38994: done with get_vars() 30575 1726867685.39008: done getting variables 30575 1726867685.39051: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:28:05 -0400 (0:00:00.032) 0:02:00.768 ****** 30575 1726867685.39076: entering _queue_task() for managed_node3/service 30575 1726867685.39302: worker is 1 (out of 1 available) 30575 1726867685.39320: exiting _queue_task() for managed_node3/service 30575 1726867685.39334: done queuing things up, now waiting for results queue to drain 30575 1726867685.39336: waiting for pending results... 
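[Editor's annotation] The restart task queued here loads the `service` action module (task path roles/network/tasks/main.yml:109) and, per the surrounding trace, gates on the wireless/team connection flags. A hedged sketch of such a task, assuming the handler simply restarts the NetworkManager unit:

```yaml
# Illustrative reconstruction -- service name and state are assumptions
# based on the task title in the log.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```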
30575 1726867685.39522: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30575 1726867685.39607: in run() - task 0affcac9-a3a5-e081-a588-00000000269f 30575 1726867685.39622: variable 'ansible_search_path' from source: unknown 30575 1726867685.39626: variable 'ansible_search_path' from source: unknown 30575 1726867685.39652: calling self._execute() 30575 1726867685.39732: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867685.39735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867685.39745: variable 'omit' from source: magic vars 30575 1726867685.40030: variable 'ansible_distribution_major_version' from source: facts 30575 1726867685.40039: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867685.40128: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867685.40260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867685.41738: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867685.41790: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867685.41820: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867685.41843: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867685.41868: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867685.41927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30575 1726867685.41948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.41969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.41999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.42010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.42045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.42061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.42086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.42111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.42176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.42182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.42187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.42190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.42213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.42224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.42339: variable 'network_connections' from source: include params 30575 1726867685.42348: variable 'interface' from source: play vars 30575 1726867685.42396: variable 'interface' from source: play vars 30575 1726867685.42447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867685.42556: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867685.42592: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867685.42616: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867685.42639: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867685.42668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867685.42686: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867685.42704: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.42724: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867685.42761: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867685.42910: variable 'network_connections' from source: include params 30575 1726867685.42914: variable 'interface' from source: play vars 30575 1726867685.42962: variable 'interface' from source: play vars 30575 1726867685.42981: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30575 1726867685.42984: when evaluation is False, skipping this task 30575 1726867685.42987: _execute() done 30575 1726867685.42989: dumping result to json 30575 1726867685.42991: done dumping result, returning 30575 1726867685.42999: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-e081-a588-00000000269f] 30575 1726867685.43004: sending task result for task 0affcac9-a3a5-e081-a588-00000000269f 30575 1726867685.43090: done sending task result for task 
0affcac9-a3a5-e081-a588-00000000269f 30575 1726867685.43099: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30575 1726867685.43146: no more pending results, returning what we have 30575 1726867685.43149: results queue empty 30575 1726867685.43150: checking for any_errors_fatal 30575 1726867685.43158: done checking for any_errors_fatal 30575 1726867685.43159: checking for max_fail_percentage 30575 1726867685.43160: done checking for max_fail_percentage 30575 1726867685.43161: checking to see if all hosts have failed and the running result is not ok 30575 1726867685.43162: done checking to see if all hosts have failed 30575 1726867685.43163: getting the remaining hosts for this loop 30575 1726867685.43164: done getting the remaining hosts for this loop 30575 1726867685.43167: getting the next task for host managed_node3 30575 1726867685.43175: done getting next task for host managed_node3 30575 1726867685.43181: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867685.43186: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867685.43219: getting variables 30575 1726867685.43221: in VariableManager get_vars() 30575 1726867685.43263: Calling all_inventory to load vars for managed_node3 30575 1726867685.43266: Calling groups_inventory to load vars for managed_node3 30575 1726867685.43268: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867685.43278: Calling all_plugins_play to load vars for managed_node3 30575 1726867685.43281: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867685.43283: Calling groups_plugins_play to load vars for managed_node3 30575 1726867685.44073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867685.44954: done with get_vars() 30575 1726867685.44970: done getting variables 30575 1726867685.45013: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:28:05 -0400 (0:00:00.059) 0:02:00.827 ****** 30575 1726867685.45039: entering _queue_task() for managed_node3/service 30575 1726867685.45283: worker is 1 (out of 1 available) 30575 1726867685.45298: exiting _queue_task() for managed_node3/service 30575 1726867685.45313: done 
queuing things up, now waiting for results queue to drain 30575 1726867685.45315: waiting for pending results... 30575 1726867685.45497: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30575 1726867685.45591: in run() - task 0affcac9-a3a5-e081-a588-0000000026a0 30575 1726867685.45602: variable 'ansible_search_path' from source: unknown 30575 1726867685.45606: variable 'ansible_search_path' from source: unknown 30575 1726867685.45637: calling self._execute() 30575 1726867685.45715: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867685.45722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867685.45728: variable 'omit' from source: magic vars 30575 1726867685.46009: variable 'ansible_distribution_major_version' from source: facts 30575 1726867685.46020: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867685.46130: variable 'network_provider' from source: set_fact 30575 1726867685.46135: variable 'network_state' from source: role '' defaults 30575 1726867685.46144: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30575 1726867685.46150: variable 'omit' from source: magic vars 30575 1726867685.46189: variable 'omit' from source: magic vars 30575 1726867685.46210: variable 'network_service_name' from source: role '' defaults 30575 1726867685.46260: variable 'network_service_name' from source: role '' defaults 30575 1726867685.46334: variable '__network_provider_setup' from source: role '' defaults 30575 1726867685.46337: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867685.46383: variable '__network_service_name_default_nm' from source: role '' defaults 30575 1726867685.46391: variable '__network_packages_default_nm' from source: role '' defaults 30575 1726867685.46437: variable '__network_packages_default_nm' from source: role '' 
defaults 30575 1726867685.46579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867685.48255: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867685.48309: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867685.48336: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867685.48360: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867685.48383: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867685.48439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.48459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.48478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.48507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.48520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.48548: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.48564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.48581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.48609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.48622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.48761: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30575 1726867685.48834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.48850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.48867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.48893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.48903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.48967: variable 'ansible_python' from source: facts 30575 1726867685.48980: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30575 1726867685.49034: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867685.49090: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867685.49174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.49194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.49210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.49236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.49251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.49283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867685.49302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867685.49318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.49344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867685.49358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867685.49445: variable 'network_connections' from source: include params 30575 1726867685.49452: variable 'interface' from source: play vars 30575 1726867685.49506: variable 'interface' from source: play vars 30575 1726867685.49581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867685.49704: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867685.49740: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867685.49768: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867685.49801: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867685.49846: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867685.49866: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867685.49890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867685.49915: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867685.49953: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867685.50128: variable 'network_connections' from source: include params 30575 1726867685.50134: variable 'interface' from source: play vars 30575 1726867685.50185: variable 'interface' from source: play vars 30575 1726867685.50207: variable '__network_packages_default_wireless' from source: role '' defaults 30575 1726867685.50264: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867685.50443: variable 'network_connections' from source: include params 30575 1726867685.50446: variable 'interface' from source: play vars 30575 1726867685.50497: variable 'interface' from source: play vars 30575 1726867685.50513: variable '__network_packages_default_team' from source: role '' defaults 30575 1726867685.50569: variable '__network_team_connections_defined' from source: role '' defaults 30575 1726867685.50749: variable 'network_connections' from source: include params 30575 1726867685.50752: variable 'interface' from source: play vars 30575 1726867685.50803: variable 'interface' from source: play vars 30575 1726867685.50840: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30575 1726867685.50883: variable '__network_service_name_default_initscripts' from source: role '' defaults 30575 1726867685.50888: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867685.50933: variable '__network_packages_default_initscripts' from source: role '' defaults 30575 1726867685.51065: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30575 1726867685.51363: variable 'network_connections' from source: include params 30575 1726867685.51367: variable 'interface' from source: play vars 30575 1726867685.51409: variable 'interface' from source: play vars 30575 1726867685.51416: variable 'ansible_distribution' from source: facts 30575 1726867685.51418: variable '__network_rh_distros' from source: role '' defaults 30575 1726867685.51426: variable 'ansible_distribution_major_version' from source: facts 30575 1726867685.51437: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30575 1726867685.51552: variable 'ansible_distribution' from source: facts 30575 1726867685.51555: variable '__network_rh_distros' from source: role '' defaults 30575 1726867685.51558: variable 'ansible_distribution_major_version' from source: facts 30575 1726867685.51570: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30575 1726867685.51680: variable 'ansible_distribution' from source: facts 30575 1726867685.51686: variable '__network_rh_distros' from source: role '' defaults 30575 1726867685.51692: variable 'ansible_distribution_major_version' from source: facts 30575 1726867685.51716: variable 'network_provider' from source: set_fact 30575 1726867685.51734: variable 'omit' from source: magic vars 30575 1726867685.51752: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867685.51773: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867685.51791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867685.51804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867685.51812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867685.51835: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867685.51838: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867685.51840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867685.51908: Set connection var ansible_pipelining to False 30575 1726867685.51911: Set connection var ansible_shell_type to sh 30575 1726867685.51916: Set connection var ansible_shell_executable to /bin/sh 30575 1726867685.51921: Set connection var ansible_timeout to 10 30575 1726867685.51926: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867685.51933: Set connection var ansible_connection to ssh 30575 1726867685.51953: variable 'ansible_shell_executable' from source: unknown 30575 1726867685.51956: variable 'ansible_connection' from source: unknown 30575 1726867685.51958: variable 'ansible_module_compression' from source: unknown 30575 1726867685.51960: variable 'ansible_shell_type' from source: unknown 30575 1726867685.51962: variable 'ansible_shell_executable' from source: unknown 30575 1726867685.51965: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867685.51969: variable 'ansible_pipelining' from source: unknown 30575 1726867685.51971: variable 'ansible_timeout' from source: unknown 30575 1726867685.51976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867685.52044: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867685.52052: variable 'omit' from source: magic vars 30575 1726867685.52057: starting attempt loop 30575 1726867685.52060: running the handler 30575 1726867685.52115: variable 'ansible_facts' from source: unknown 30575 1726867685.52497: _low_level_execute_command(): starting 30575 1726867685.52504: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867685.53005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867685.53010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867685.53013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867685.53067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867685.53071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
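
The first `_low_level_execute_command()` above runs `/bin/sh -c 'echo ~ && sleep 0'` over the multiplexed ssh connection to discover the remote user's home directory (the `/root` seen in the stdout chunk). A local sketch of the same probe, without ssh, assuming a POSIX `/bin/sh` is available:

```python
import subprocess

# Same command string Ansible sends to the managed host in the log above;
# run locally here, so it reports the local user's home instead of /root.
out = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip())  # e.g. /root in the log above
```

The trailing `&& sleep 0` is part of the command as logged; the home path it returns is what anchors the `~/.ansible/tmp` working directory created next.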
30575 1726867685.53073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867685.53131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867685.54813: stdout chunk (state=3): >>>/root <<< 30575 1726867685.54913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867685.54943: stderr chunk (state=3): >>><<< 30575 1726867685.54946: stdout chunk (state=3): >>><<< 30575 1726867685.54963: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867685.54972: _low_level_execute_command(): starting 30575 1726867685.54979: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867685.5496182-35739-179530320858032 `" && echo ansible-tmp-1726867685.5496182-35739-179530320858032="` echo /root/.ansible/tmp/ansible-tmp-1726867685.5496182-35739-179530320858032 `" ) && sleep 0' 30575 1726867685.55381: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867685.55385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867685.55398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867685.55452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867685.55455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867685.55504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867685.57399: stdout chunk (state=3): >>>ansible-tmp-1726867685.5496182-35739-179530320858032=/root/.ansible/tmp/ansible-tmp-1726867685.5496182-35739-179530320858032 <<< 30575 1726867685.57499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867685.57532: stderr 
chunk (state=3): >>><<< 30575 1726867685.57535: stdout chunk (state=3): >>><<< 30575 1726867685.57549: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867685.5496182-35739-179530320858032=/root/.ansible/tmp/ansible-tmp-1726867685.5496182-35739-179530320858032 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867685.57580: variable 'ansible_module_compression' from source: unknown 30575 1726867685.57622: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30575 1726867685.57679: variable 'ansible_facts' from source: unknown 30575 1726867685.57817: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867685.5496182-35739-179530320858032/AnsiballZ_systemd.py 30575 1726867685.57927: Sending initial data 30575 1726867685.57931: Sent initial data (156 
bytes) 30575 1726867685.58390: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867685.58394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867685.58396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867685.58402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867685.58404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867685.58454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867685.58461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867685.58503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867685.60034: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 30575 1726867685.60037: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867685.60078: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867685.60125: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp6t0hr97t /root/.ansible/tmp/ansible-tmp-1726867685.5496182-35739-179530320858032/AnsiballZ_systemd.py <<< 30575 1726867685.60127: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867685.5496182-35739-179530320858032/AnsiballZ_systemd.py" <<< 30575 1726867685.60162: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp6t0hr97t" to remote "/root/.ansible/tmp/ansible-tmp-1726867685.5496182-35739-179530320858032/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867685.5496182-35739-179530320858032/AnsiballZ_systemd.py" <<< 30575 1726867685.61225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867685.61264: stderr chunk (state=3): >>><<< 30575 1726867685.61267: stdout chunk (state=3): >>><<< 30575 1726867685.61288: done transferring module to remote 30575 1726867685.61296: _low_level_execute_command(): starting 30575 1726867685.61300: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867685.5496182-35739-179530320858032/ /root/.ansible/tmp/ansible-tmp-1726867685.5496182-35739-179530320858032/AnsiballZ_systemd.py && sleep 0' 30575 1726867685.61726: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867685.61729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867685.61731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867685.61734: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867685.61736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867685.61782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867685.61795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867685.61834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867685.63557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867685.63579: stderr chunk (state=3): >>><<< 30575 1726867685.63583: stdout chunk (state=3): >>><<< 30575 1726867685.63594: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867685.63598: _low_level_execute_command(): starting 30575 1726867685.63600: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867685.5496182-35739-179530320858032/AnsiballZ_systemd.py && sleep 0' 30575 1726867685.64003: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867685.64006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867685.64009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867685.64011: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867685.64013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867685.64063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867685.64066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867685.64115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867685.92958: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call or<<< 30575 1726867685.92963: stdout chunk (state=3): >>>g.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10526720", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313762304", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "2052887000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", 
"IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "Privat<<< 30575 1726867685.93003: stdout chunk (state=3): >>>eIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", 
"InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30575 1726867685.94793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867685.94826: stderr chunk (state=3): >>><<< 30575 1726867685.94829: stdout chunk (state=3): >>><<< 30575 1726867685.94845: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10526720", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313762304", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "2052887000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867685.94967: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867685.5496182-35739-179530320858032/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867685.94986: _low_level_execute_command(): starting 30575 1726867685.94989: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867685.5496182-35739-179530320858032/ > /dev/null 2>&1 && sleep 0' 30575 1726867685.95441: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867685.95444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867685.95447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867685.95451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867685.95453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867685.95502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867685.95506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867685.95510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867685.95552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867685.97354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867685.97380: stderr chunk (state=3): >>><<< 30575 1726867685.97384: stdout chunk (state=3): >>><<< 30575 1726867685.97395: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867685.97401: handler run complete 30575 1726867685.97439: attempt loop complete, returning result 30575 1726867685.97442: _execute() done 30575 1726867685.97444: dumping result to json 30575 1726867685.97456: done dumping result, returning 30575 1726867685.97464: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-e081-a588-0000000026a0] 30575 1726867685.97469: sending task result for task 0affcac9-a3a5-e081-a588-0000000026a0 30575 1726867685.97704: done sending task result for task 0affcac9-a3a5-e081-a588-0000000026a0 30575 1726867685.97706: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867685.97769: no more pending results, returning what we have 30575 1726867685.97772: results queue empty 30575 1726867685.97773: checking for any_errors_fatal 30575 1726867685.97780: done checking for any_errors_fatal 30575 1726867685.97781: checking for max_fail_percentage 30575 1726867685.97782: done checking for max_fail_percentage 30575 1726867685.97783: checking to see if all hosts have failed and the running result is not ok 30575 1726867685.97784: done checking to see if all hosts have failed 30575 1726867685.97785: getting the remaining hosts for this loop 30575 1726867685.97786: done getting the remaining hosts for this loop 30575 1726867685.97789: getting the next task for host managed_node3 30575 1726867685.97797: done getting next task for host managed_node3 30575 1726867685.97800: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867685.97805: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867685.97820: getting variables 30575 1726867685.97822: in VariableManager get_vars() 30575 1726867685.97861: Calling all_inventory to load vars for managed_node3 30575 1726867685.97863: Calling groups_inventory to load vars for managed_node3 30575 1726867685.97865: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867685.97874: Calling all_plugins_play to load vars for managed_node3 30575 1726867685.97876: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867685.97884: Calling groups_plugins_play to load vars for managed_node3 30575 1726867685.98797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867685.99660: done with get_vars() 30575 1726867685.99676: done getting variables 30575 1726867685.99725: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:28:05 -0400 (0:00:00.547) 0:02:01.375 ****** 30575 1726867685.99753: entering _queue_task() for managed_node3/service 30575 1726867685.99991: worker is 1 (out of 1 available) 30575 1726867686.00004: exiting _queue_task() for managed_node3/service 30575 1726867686.00021: done queuing things up, now waiting for results queue to drain 30575 1726867686.00023: waiting for pending results... 
30575 1726867686.00208: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30575 1726867686.00299: in run() - task 0affcac9-a3a5-e081-a588-0000000026a1 30575 1726867686.00311: variable 'ansible_search_path' from source: unknown 30575 1726867686.00314: variable 'ansible_search_path' from source: unknown 30575 1726867686.00343: calling self._execute() 30575 1726867686.00424: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867686.00427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867686.00434: variable 'omit' from source: magic vars 30575 1726867686.00721: variable 'ansible_distribution_major_version' from source: facts 30575 1726867686.00727: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867686.00809: variable 'network_provider' from source: set_fact 30575 1726867686.00815: Evaluated conditional (network_provider == "nm"): True 30575 1726867686.00883: variable '__network_wpa_supplicant_required' from source: role '' defaults 30575 1726867686.00946: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30575 1726867686.01062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867686.02501: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867686.02548: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867686.02574: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867686.02602: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867686.02623: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867686.02691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867686.02711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867686.02730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867686.02758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867686.02769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867686.02803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867686.02821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867686.02836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867686.02864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867686.02874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867686.02903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867686.02922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867686.02939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867686.02967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867686.02974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867686.03067: variable 'network_connections' from source: include params 30575 1726867686.03076: variable 'interface' from source: play vars 30575 1726867686.03124: variable 'interface' from source: play vars 30575 1726867686.03171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30575 1726867686.03281: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30575 1726867686.03309: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30575 1726867686.03332: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30575 1726867686.03354: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30575 1726867686.03384: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30575 1726867686.03400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30575 1726867686.03422: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867686.03437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30575 1726867686.03474: variable '__network_wireless_connections_defined' from source: role '' defaults 30575 1726867686.03626: variable 'network_connections' from source: include params 30575 1726867686.03629: variable 'interface' from source: play vars 30575 1726867686.03672: variable 'interface' from source: play vars 30575 1726867686.03695: Evaluated conditional (__network_wpa_supplicant_required): False 30575 1726867686.03699: when evaluation is False, skipping this task 30575 1726867686.03701: _execute() done 30575 1726867686.03705: dumping result to json 30575 1726867686.03707: done dumping result, returning 30575 1726867686.03714: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-e081-a588-0000000026a1] 30575 
1726867686.03728: sending task result for task 0affcac9-a3a5-e081-a588-0000000026a1 30575 1726867686.03802: done sending task result for task 0affcac9-a3a5-e081-a588-0000000026a1 30575 1726867686.03805: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30575 1726867686.03883: no more pending results, returning what we have 30575 1726867686.03887: results queue empty 30575 1726867686.03888: checking for any_errors_fatal 30575 1726867686.03907: done checking for any_errors_fatal 30575 1726867686.03908: checking for max_fail_percentage 30575 1726867686.03910: done checking for max_fail_percentage 30575 1726867686.03911: checking to see if all hosts have failed and the running result is not ok 30575 1726867686.03911: done checking to see if all hosts have failed 30575 1726867686.03912: getting the remaining hosts for this loop 30575 1726867686.03913: done getting the remaining hosts for this loop 30575 1726867686.03919: getting the next task for host managed_node3 30575 1726867686.03928: done getting next task for host managed_node3 30575 1726867686.03931: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867686.03935: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867686.03959: getting variables 30575 1726867686.03960: in VariableManager get_vars() 30575 1726867686.04002: Calling all_inventory to load vars for managed_node3 30575 1726867686.04005: Calling groups_inventory to load vars for managed_node3 30575 1726867686.04007: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867686.04015: Calling all_plugins_play to load vars for managed_node3 30575 1726867686.04020: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867686.04023: Calling groups_plugins_play to load vars for managed_node3 30575 1726867686.04894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867686.05760: done with get_vars() 30575 1726867686.05774: done getting variables 30575 1726867686.05816: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:28:06 -0400 (0:00:00.060) 0:02:01.435 
****** 30575 1726867686.05840: entering _queue_task() for managed_node3/service 30575 1726867686.06051: worker is 1 (out of 1 available) 30575 1726867686.06065: exiting _queue_task() for managed_node3/service 30575 1726867686.06081: done queuing things up, now waiting for results queue to drain 30575 1726867686.06083: waiting for pending results... 30575 1726867686.06256: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30575 1726867686.06346: in run() - task 0affcac9-a3a5-e081-a588-0000000026a2 30575 1726867686.06358: variable 'ansible_search_path' from source: unknown 30575 1726867686.06362: variable 'ansible_search_path' from source: unknown 30575 1726867686.06391: calling self._execute() 30575 1726867686.06465: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867686.06468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867686.06479: variable 'omit' from source: magic vars 30575 1726867686.06742: variable 'ansible_distribution_major_version' from source: facts 30575 1726867686.06750: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867686.06832: variable 'network_provider' from source: set_fact 30575 1726867686.06838: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867686.06841: when evaluation is False, skipping this task 30575 1726867686.06843: _execute() done 30575 1726867686.06846: dumping result to json 30575 1726867686.06850: done dumping result, returning 30575 1726867686.06857: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-e081-a588-0000000026a2] 30575 1726867686.06863: sending task result for task 0affcac9-a3a5-e081-a588-0000000026a2 30575 1726867686.06948: done sending task result for task 0affcac9-a3a5-e081-a588-0000000026a2 30575 1726867686.06951: WORKER PROCESS EXITING skipping: 
[managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30575 1726867686.07006: no more pending results, returning what we have 30575 1726867686.07009: results queue empty 30575 1726867686.07010: checking for any_errors_fatal 30575 1726867686.07015: done checking for any_errors_fatal 30575 1726867686.07016: checking for max_fail_percentage 30575 1726867686.07020: done checking for max_fail_percentage 30575 1726867686.07020: checking to see if all hosts have failed and the running result is not ok 30575 1726867686.07021: done checking to see if all hosts have failed 30575 1726867686.07022: getting the remaining hosts for this loop 30575 1726867686.07023: done getting the remaining hosts for this loop 30575 1726867686.07026: getting the next task for host managed_node3 30575 1726867686.07034: done getting next task for host managed_node3 30575 1726867686.07037: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867686.07041: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867686.07064: getting variables 30575 1726867686.07066: in VariableManager get_vars() 30575 1726867686.07113: Calling all_inventory to load vars for managed_node3 30575 1726867686.07115: Calling groups_inventory to load vars for managed_node3 30575 1726867686.07120: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867686.07126: Calling all_plugins_play to load vars for managed_node3 30575 1726867686.07127: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867686.07129: Calling groups_plugins_play to load vars for managed_node3 30575 1726867686.07861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867686.08737: done with get_vars() 30575 1726867686.08751: done getting variables 30575 1726867686.08792: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:28:06 -0400 (0:00:00.029) 0:02:01.465 ****** 30575 1726867686.08819: entering _queue_task() for managed_node3/copy 30575 1726867686.09024: worker is 1 (out of 1 available) 30575 1726867686.09038: exiting _queue_task() for managed_node3/copy 30575 1726867686.09050: done queuing things up, now waiting for results queue to drain 30575 1726867686.09052: waiting for 
pending results... 30575 1726867686.09230: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30575 1726867686.09309: in run() - task 0affcac9-a3a5-e081-a588-0000000026a3 30575 1726867686.09323: variable 'ansible_search_path' from source: unknown 30575 1726867686.09326: variable 'ansible_search_path' from source: unknown 30575 1726867686.09351: calling self._execute() 30575 1726867686.09428: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867686.09432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867686.09441: variable 'omit' from source: magic vars 30575 1726867686.09709: variable 'ansible_distribution_major_version' from source: facts 30575 1726867686.09719: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867686.09797: variable 'network_provider' from source: set_fact 30575 1726867686.09802: Evaluated conditional (network_provider == "initscripts"): False 30575 1726867686.09806: when evaluation is False, skipping this task 30575 1726867686.09808: _execute() done 30575 1726867686.09810: dumping result to json 30575 1726867686.09815: done dumping result, returning 30575 1726867686.09829: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-e081-a588-0000000026a3] 30575 1726867686.09832: sending task result for task 0affcac9-a3a5-e081-a588-0000000026a3 30575 1726867686.09912: done sending task result for task 0affcac9-a3a5-e081-a588-0000000026a3 30575 1726867686.09916: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30575 1726867686.09972: no more pending results, returning what we have 30575 1726867686.09975: results queue empty 30575 
1726867686.09975: checking for any_errors_fatal 30575 1726867686.09983: done checking for any_errors_fatal 30575 1726867686.09983: checking for max_fail_percentage 30575 1726867686.09985: done checking for max_fail_percentage 30575 1726867686.09986: checking to see if all hosts have failed and the running result is not ok 30575 1726867686.09986: done checking to see if all hosts have failed 30575 1726867686.09987: getting the remaining hosts for this loop 30575 1726867686.09988: done getting the remaining hosts for this loop 30575 1726867686.09992: getting the next task for host managed_node3 30575 1726867686.09999: done getting next task for host managed_node3 30575 1726867686.10003: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867686.10007: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867686.10031: getting variables 30575 1726867686.10033: in VariableManager get_vars() 30575 1726867686.10069: Calling all_inventory to load vars for managed_node3 30575 1726867686.10071: Calling groups_inventory to load vars for managed_node3 30575 1726867686.10073: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867686.10082: Calling all_plugins_play to load vars for managed_node3 30575 1726867686.10085: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867686.10087: Calling groups_plugins_play to load vars for managed_node3 30575 1726867686.10956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867686.11802: done with get_vars() 30575 1726867686.11820: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:28:06 -0400 (0:00:00.030) 0:02:01.496 ****** 30575 1726867686.11873: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867686.12070: worker is 1 (out of 1 available) 30575 1726867686.12086: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30575 1726867686.12099: done queuing things up, now waiting for results queue to drain 30575 1726867686.12101: waiting for pending results... 
30575 1726867686.12271: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30575 1726867686.12363: in run() - task 0affcac9-a3a5-e081-a588-0000000026a4 30575 1726867686.12374: variable 'ansible_search_path' from source: unknown 30575 1726867686.12382: variable 'ansible_search_path' from source: unknown 30575 1726867686.12407: calling self._execute() 30575 1726867686.12483: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867686.12487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867686.12495: variable 'omit' from source: magic vars 30575 1726867686.12757: variable 'ansible_distribution_major_version' from source: facts 30575 1726867686.12766: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867686.12771: variable 'omit' from source: magic vars 30575 1726867686.12812: variable 'omit' from source: magic vars 30575 1726867686.12921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867686.14333: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867686.14376: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867686.14405: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867686.14431: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867686.14450: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867686.14511: variable 'network_provider' from source: set_fact 30575 1726867686.14597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867686.14621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867686.14638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867686.14664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30575 1726867686.14674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867686.14730: variable 'omit' from source: magic vars 30575 1726867686.14801: variable 'omit' from source: magic vars 30575 1726867686.14872: variable 'network_connections' from source: include params 30575 1726867686.14882: variable 'interface' from source: play vars 30575 1726867686.14925: variable 'interface' from source: play vars 30575 1726867686.15029: variable 'omit' from source: magic vars 30575 1726867686.15037: variable '__lsr_ansible_managed' from source: task vars 30575 1726867686.15082: variable '__lsr_ansible_managed' from source: task vars 30575 1726867686.15207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30575 1726867686.15340: Loaded config def from plugin (lookup/template) 30575 1726867686.15343: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30575 1726867686.15363: File lookup term: get_ansible_managed.j2 30575 1726867686.15368: variable 
'ansible_search_path' from source: unknown 30575 1726867686.15371: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30575 1726867686.15384: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30575 1726867686.15398: variable 'ansible_search_path' from source: unknown 30575 1726867686.18685: variable 'ansible_managed' from source: unknown 30575 1726867686.18756: variable 'omit' from source: magic vars 30575 1726867686.18774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867686.18794: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867686.18808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867686.18822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30575 1726867686.18829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867686.18853: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867686.18856: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867686.18859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867686.18922: Set connection var ansible_pipelining to False 30575 1726867686.18925: Set connection var ansible_shell_type to sh 30575 1726867686.18928: Set connection var ansible_shell_executable to /bin/sh 30575 1726867686.18933: Set connection var ansible_timeout to 10 30575 1726867686.18938: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867686.18944: Set connection var ansible_connection to ssh 30575 1726867686.18964: variable 'ansible_shell_executable' from source: unknown 30575 1726867686.18967: variable 'ansible_connection' from source: unknown 30575 1726867686.18969: variable 'ansible_module_compression' from source: unknown 30575 1726867686.18971: variable 'ansible_shell_type' from source: unknown 30575 1726867686.18973: variable 'ansible_shell_executable' from source: unknown 30575 1726867686.18976: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867686.18980: variable 'ansible_pipelining' from source: unknown 30575 1726867686.18984: variable 'ansible_timeout' from source: unknown 30575 1726867686.18988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867686.19068: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867686.19081: variable 'omit' from 
source: magic vars 30575 1726867686.19084: starting attempt loop 30575 1726867686.19086: running the handler 30575 1726867686.19098: _low_level_execute_command(): starting 30575 1726867686.19104: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867686.19593: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867686.19598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.19601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867686.19603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.19658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867686.19661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867686.19664: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867686.19725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867686.21384: stdout chunk (state=3): >>>/root <<< 30575 1726867686.21484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 
1726867686.21512: stderr chunk (state=3): >>><<< 30575 1726867686.21515: stdout chunk (state=3): >>><<< 30575 1726867686.21539: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867686.21546: _low_level_execute_command(): starting 30575 1726867686.21552: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867686.2153497-35753-142221525388278 `" && echo ansible-tmp-1726867686.2153497-35753-142221525388278="` echo /root/.ansible/tmp/ansible-tmp-1726867686.2153497-35753-142221525388278 `" ) && sleep 0' 30575 1726867686.21963: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30575 1726867686.21969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.21988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.22036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867686.22040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867686.22092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867686.23945: stdout chunk (state=3): >>>ansible-tmp-1726867686.2153497-35753-142221525388278=/root/.ansible/tmp/ansible-tmp-1726867686.2153497-35753-142221525388278 <<< 30575 1726867686.24053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867686.24080: stderr chunk (state=3): >>><<< 30575 1726867686.24083: stdout chunk (state=3): >>><<< 30575 1726867686.24098: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867686.2153497-35753-142221525388278=/root/.ansible/tmp/ansible-tmp-1726867686.2153497-35753-142221525388278 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867686.24135: variable 'ansible_module_compression' from source: unknown 30575 1726867686.24170: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30575 1726867686.24197: variable 'ansible_facts' from source: unknown 30575 1726867686.24264: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867686.2153497-35753-142221525388278/AnsiballZ_network_connections.py 30575 1726867686.24358: Sending initial data 30575 1726867686.24361: Sent initial data (168 bytes) 30575 1726867686.24798: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867686.24802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867686.24807: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.24810: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867686.24812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.24856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867686.24859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867686.24908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867686.26430: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 30575 1726867686.26433: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" 
debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867686.26472: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867686.26514: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp3upiwk8e /root/.ansible/tmp/ansible-tmp-1726867686.2153497-35753-142221525388278/AnsiballZ_network_connections.py <<< 30575 1726867686.26522: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867686.2153497-35753-142221525388278/AnsiballZ_network_connections.py" <<< 30575 1726867686.26559: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp3upiwk8e" to remote "/root/.ansible/tmp/ansible-tmp-1726867686.2153497-35753-142221525388278/AnsiballZ_network_connections.py" <<< 30575 1726867686.26561: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867686.2153497-35753-142221525388278/AnsiballZ_network_connections.py" <<< 30575 1726867686.27274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867686.27312: stderr chunk (state=3): >>><<< 30575 1726867686.27315: stdout chunk (state=3): >>><<< 30575 1726867686.27354: done transferring module to remote 30575 1726867686.27362: _low_level_execute_command(): starting 30575 1726867686.27366: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867686.2153497-35753-142221525388278/ /root/.ansible/tmp/ansible-tmp-1726867686.2153497-35753-142221525388278/AnsiballZ_network_connections.py && sleep 0' 30575 1726867686.27787: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867686.27790: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867686.27792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.27795: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867686.27797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.27844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867686.27847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867686.27897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867686.29632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867686.29655: stderr chunk (state=3): >>><<< 30575 1726867686.29658: stdout chunk (state=3): >>><<< 30575 1726867686.29672: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867686.29675: _low_level_execute_command(): starting 30575 1726867686.29680: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867686.2153497-35753-142221525388278/AnsiballZ_network_connections.py && sleep 0' 30575 1726867686.30082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867686.30086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867686.30102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867686.30105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.30150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867686.30154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867686.30210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867686.56011: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30575 1726867686.57699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867686.57726: stderr chunk (state=3): >>><<< 30575 1726867686.57729: stdout chunk (state=3): >>><<< 30575 1726867686.57744: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867686.57776: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867686.2153497-35753-142221525388278/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867686.57786: _low_level_execute_command(): starting 30575 1726867686.57791: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867686.2153497-35753-142221525388278/ > /dev/null 2>&1 && sleep 0' 30575 1726867686.58250: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867686.58253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867686.58256: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.58258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867686.58260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867686.58262: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.58316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867686.58323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867686.58325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867686.58365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867686.60160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867686.60187: stderr chunk (state=3): >>><<< 30575 1726867686.60190: stdout chunk (state=3): >>><<< 30575 1726867686.60202: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867686.60208: handler run complete 30575 1726867686.60228: attempt loop complete, returning result 30575 1726867686.60231: _execute() done 30575 1726867686.60234: dumping result to json 30575 1726867686.60237: done dumping result, returning 30575 1726867686.60247: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-e081-a588-0000000026a4] 30575 1726867686.60254: sending task result for task 0affcac9-a3a5-e081-a588-0000000026a4 30575 1726867686.60352: done sending task result for task 0affcac9-a3a5-e081-a588-0000000026a4 30575 1726867686.60355: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "name": "statebr",
                    "persistent_state": "absent",
                    "state": "down"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": false
}

STDERR:

[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete

30575 1726867686.60454: no more pending results, returning what we have 30575 1726867686.60457: results queue empty 30575 1726867686.60458: checking for any_errors_fatal 30575 1726867686.60464: done checking for any_errors_fatal 30575 1726867686.60465: checking for max_fail_percentage 30575 1726867686.60466: done checking for max_fail_percentage 30575 1726867686.60467: checking to see if all hosts have failed 
and the running result is not ok 30575 1726867686.60468: done checking to see if all hosts have failed 30575 1726867686.60469: getting the remaining hosts for this loop 30575 1726867686.60470: done getting the remaining hosts for this loop 30575 1726867686.60474: getting the next task for host managed_node3 30575 1726867686.60483: done getting next task for host managed_node3 30575 1726867686.60486: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867686.60491: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867686.60505: getting variables 30575 1726867686.60507: in VariableManager get_vars() 30575 1726867686.60554: Calling all_inventory to load vars for managed_node3 30575 1726867686.60557: Calling groups_inventory to load vars for managed_node3 30575 1726867686.60559: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867686.60568: Calling all_plugins_play to load vars for managed_node3 30575 1726867686.60570: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867686.60572: Calling groups_plugins_play to load vars for managed_node3 30575 1726867686.61433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867686.62304: done with get_vars() 30575 1726867686.62324: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:28:06 -0400 (0:00:00.505) 0:02:02.001 ****** 30575 1726867686.62386: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867686.62620: worker is 1 (out of 1 available) 30575 1726867686.62633: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30575 1726867686.62646: done queuing things up, now waiting for results queue to drain 30575 1726867686.62648: waiting for pending results... 
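For reference, the `module_args` captured in the "Configure networking connection profiles" result above correspond to a role invocation along these lines. This is a hedged reconstruction from the logged arguments, not the playbook's literal source; the variable names `network_provider` and `network_connections` are the role's documented inputs, and only the values come from this run:

```yaml
# Sketch reconstructed from the logged module_args (provider: nm,
# connections: statebr / down / absent); not copied from the test playbook.
- name: Tear down the test profile via the network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_provider: nm
    network_connections:
      - name: statebr
        state: down                # take the connection down...
        persistent_state: absent   # ...and delete its profile if present
```

The stderr line in the result ("no connection matches 'statebr' to delete") is consistent with this: the profile was already gone, so the task reports `changed: false`.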
30575 1726867686.62839: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30575 1726867686.62936: in run() - task 0affcac9-a3a5-e081-a588-0000000026a5 30575 1726867686.62952: variable 'ansible_search_path' from source: unknown 30575 1726867686.62956: variable 'ansible_search_path' from source: unknown 30575 1726867686.62985: calling self._execute() 30575 1726867686.63068: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867686.63072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867686.63076: variable 'omit' from source: magic vars 30575 1726867686.63558: variable 'ansible_distribution_major_version' from source: facts 30575 1726867686.63562: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867686.63579: variable 'network_state' from source: role '' defaults 30575 1726867686.63591: Evaluated conditional (network_state != {}): False 30575 1726867686.63594: when evaluation is False, skipping this task 30575 1726867686.63596: _execute() done 30575 1726867686.63602: dumping result to json 30575 1726867686.63604: done dumping result, returning 30575 1726867686.63612: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-e081-a588-0000000026a5] 30575 1726867686.63620: sending task result for task 0affcac9-a3a5-e081-a588-0000000026a5 30575 1726867686.63713: done sending task result for task 0affcac9-a3a5-e081-a588-0000000026a5 30575 1726867686.63716: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30575 1726867686.63766: no more pending results, returning what we have 30575 1726867686.63770: results queue empty 30575 1726867686.63770: checking for any_errors_fatal 30575 1726867686.63783: done checking for any_errors_fatal 
30575 1726867686.63783: checking for max_fail_percentage 30575 1726867686.63785: done checking for max_fail_percentage 30575 1726867686.63786: checking to see if all hosts have failed and the running result is not ok 30575 1726867686.63787: done checking to see if all hosts have failed 30575 1726867686.63787: getting the remaining hosts for this loop 30575 1726867686.63790: done getting the remaining hosts for this loop 30575 1726867686.63793: getting the next task for host managed_node3 30575 1726867686.63801: done getting next task for host managed_node3 30575 1726867686.63805: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867686.63809: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867686.63835: getting variables 30575 1726867686.63837: in VariableManager get_vars() 30575 1726867686.63874: Calling all_inventory to load vars for managed_node3 30575 1726867686.63876: Calling groups_inventory to load vars for managed_node3 30575 1726867686.63956: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867686.63967: Calling all_plugins_play to load vars for managed_node3 30575 1726867686.63970: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867686.63973: Calling groups_plugins_play to load vars for managed_node3 30575 1726867686.65231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867686.66085: done with get_vars() 30575 1726867686.66100: done getting variables 30575 1726867686.66142: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:28:06 -0400 (0:00:00.037) 0:02:02.039 ****** 30575 1726867686.66167: entering _queue_task() for managed_node3/debug 30575 1726867686.66389: worker is 1 (out of 1 available) 30575 1726867686.66401: exiting _queue_task() for managed_node3/debug 30575 1726867686.66414: done queuing things up, now waiting for results queue to drain 30575 1726867686.66416: waiting for pending results... 
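The task being queued here ("Show stderr messages for the network_connections", role `tasks/main.yml:177`) produces output in the `debug: var=` format, so it is presumably a task of roughly this shape. A sketch inferred from the logged task name and its output below, not copied from the role source:

```yaml
# Sketch of the debug task the log is executing (tasks/main.yml:177);
# the printed key "__network_connections_result.stderr_lines" matches
# what `debug` emits for a `var:` lookup.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines
```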
30575 1726867686.66612: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30575 1726867686.66715: in run() - task 0affcac9-a3a5-e081-a588-0000000026a6 30575 1726867686.66731: variable 'ansible_search_path' from source: unknown 30575 1726867686.66735: variable 'ansible_search_path' from source: unknown 30575 1726867686.66766: calling self._execute() 30575 1726867686.66846: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867686.66850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867686.66860: variable 'omit' from source: magic vars 30575 1726867686.67150: variable 'ansible_distribution_major_version' from source: facts 30575 1726867686.67159: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867686.67165: variable 'omit' from source: magic vars 30575 1726867686.67211: variable 'omit' from source: magic vars 30575 1726867686.67236: variable 'omit' from source: magic vars 30575 1726867686.67269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867686.67299: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867686.67316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867686.67332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867686.67342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867686.67366: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867686.67369: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867686.67371: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30575 1726867686.67446: Set connection var ansible_pipelining to False 30575 1726867686.67449: Set connection var ansible_shell_type to sh 30575 1726867686.67452: Set connection var ansible_shell_executable to /bin/sh 30575 1726867686.67458: Set connection var ansible_timeout to 10 30575 1726867686.67463: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867686.67469: Set connection var ansible_connection to ssh 30575 1726867686.67489: variable 'ansible_shell_executable' from source: unknown 30575 1726867686.67492: variable 'ansible_connection' from source: unknown 30575 1726867686.67495: variable 'ansible_module_compression' from source: unknown 30575 1726867686.67497: variable 'ansible_shell_type' from source: unknown 30575 1726867686.67499: variable 'ansible_shell_executable' from source: unknown 30575 1726867686.67503: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867686.67506: variable 'ansible_pipelining' from source: unknown 30575 1726867686.67508: variable 'ansible_timeout' from source: unknown 30575 1726867686.67515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867686.67613: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867686.67624: variable 'omit' from source: magic vars 30575 1726867686.67630: starting attempt loop 30575 1726867686.67633: running the handler 30575 1726867686.67732: variable '__network_connections_result' from source: set_fact 30575 1726867686.67769: handler run complete 30575 1726867686.67783: attempt loop complete, returning result 30575 1726867686.67786: _execute() done 30575 1726867686.67788: dumping result to json 30575 1726867686.67791: 
done dumping result, returning 30575 1726867686.67817: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-e081-a588-0000000026a6] 30575 1726867686.67823: sending task result for task 0affcac9-a3a5-e081-a588-0000000026a6 30575 1726867686.67907: done sending task result for task 0affcac9-a3a5-e081-a588-0000000026a6 30575 1726867686.67910: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } 30575 1726867686.68009: no more pending results, returning what we have 30575 1726867686.68012: results queue empty 30575 1726867686.68013: checking for any_errors_fatal 30575 1726867686.68017: done checking for any_errors_fatal 30575 1726867686.68020: checking for max_fail_percentage 30575 1726867686.68022: done checking for max_fail_percentage 30575 1726867686.68022: checking to see if all hosts have failed and the running result is not ok 30575 1726867686.68023: done checking to see if all hosts have failed 30575 1726867686.68024: getting the remaining hosts for this loop 30575 1726867686.68025: done getting the remaining hosts for this loop 30575 1726867686.68028: getting the next task for host managed_node3 30575 1726867686.68036: done getting next task for host managed_node3 30575 1726867686.68039: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867686.68044: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867686.68056: getting variables 30575 1726867686.68057: in VariableManager get_vars() 30575 1726867686.68096: Calling all_inventory to load vars for managed_node3 30575 1726867686.68099: Calling groups_inventory to load vars for managed_node3 30575 1726867686.68100: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867686.68108: Calling all_plugins_play to load vars for managed_node3 30575 1726867686.68111: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867686.68113: Calling groups_plugins_play to load vars for managed_node3 30575 1726867686.68854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867686.69815: done with get_vars() 30575 1726867686.69830: done getting variables 30575 1726867686.69869: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:28:06 -0400 (0:00:00.037) 0:02:02.076 ****** 30575 1726867686.69898: entering _queue_task() for managed_node3/debug 30575 1726867686.70102: worker is 1 (out of 1 available) 30575 1726867686.70117: exiting _queue_task() for managed_node3/debug 30575 1726867686.70130: done queuing things up, now waiting for results queue to drain 30575 1726867686.70131: waiting for pending results... 30575 1726867686.70318: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30575 1726867686.70399: in run() - task 0affcac9-a3a5-e081-a588-0000000026a7 30575 1726867686.70412: variable 'ansible_search_path' from source: unknown 30575 1726867686.70416: variable 'ansible_search_path' from source: unknown 30575 1726867686.70446: calling self._execute() 30575 1726867686.70528: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867686.70532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867686.70541: variable 'omit' from source: magic vars 30575 1726867686.70817: variable 'ansible_distribution_major_version' from source: facts 30575 1726867686.70828: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867686.70833: variable 'omit' from source: magic vars 30575 1726867686.70876: variable 'omit' from source: magic vars 30575 1726867686.70900: variable 'omit' from source: magic vars 30575 1726867686.70935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867686.70960: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867686.70975: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867686.70989: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867686.71000: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867686.71027: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867686.71030: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867686.71033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867686.71104: Set connection var ansible_pipelining to False 30575 1726867686.71108: Set connection var ansible_shell_type to sh 30575 1726867686.71113: Set connection var ansible_shell_executable to /bin/sh 30575 1726867686.71118: Set connection var ansible_timeout to 10 30575 1726867686.71125: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867686.71132: Set connection var ansible_connection to ssh 30575 1726867686.71149: variable 'ansible_shell_executable' from source: unknown 30575 1726867686.71152: variable 'ansible_connection' from source: unknown 30575 1726867686.71155: variable 'ansible_module_compression' from source: unknown 30575 1726867686.71157: variable 'ansible_shell_type' from source: unknown 30575 1726867686.71159: variable 'ansible_shell_executable' from source: unknown 30575 1726867686.71161: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867686.71163: variable 'ansible_pipelining' from source: unknown 30575 1726867686.71167: variable 'ansible_timeout' from source: unknown 30575 1726867686.71169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867686.71270: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867686.71280: variable 'omit' from source: magic vars 30575 1726867686.71286: starting attempt loop 30575 1726867686.71289: running the handler 30575 1726867686.71329: variable '__network_connections_result' from source: set_fact 30575 1726867686.71384: variable '__network_connections_result' from source: set_fact 30575 1726867686.71463: handler run complete 30575 1726867686.71481: attempt loop complete, returning result 30575 1726867686.71484: _execute() done 30575 1726867686.71486: dumping result to json 30575 1726867686.71491: done dumping result, returning 30575 1726867686.71499: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-e081-a588-0000000026a7] 30575 1726867686.71502: sending task result for task 0affcac9-a3a5-e081-a588-0000000026a7 30575 1726867686.71591: done sending task result for task 0affcac9-a3a5-e081-a588-0000000026a7 30575 1726867686.71594: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "stderr_lines": [ "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } } 30575 1726867686.71682: no more pending results, returning what we have 30575 1726867686.71685: results queue empty 30575 1726867686.71686: checking for any_errors_fatal 30575 1726867686.71691: 
done checking for any_errors_fatal 30575 1726867686.71691: checking for max_fail_percentage 30575 1726867686.71693: done checking for max_fail_percentage 30575 1726867686.71693: checking to see if all hosts have failed and the running result is not ok 30575 1726867686.71694: done checking to see if all hosts have failed 30575 1726867686.71695: getting the remaining hosts for this loop 30575 1726867686.71697: done getting the remaining hosts for this loop 30575 1726867686.71699: getting the next task for host managed_node3 30575 1726867686.71707: done getting next task for host managed_node3 30575 1726867686.71710: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867686.71714: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867686.71726: getting variables 30575 1726867686.71727: in VariableManager get_vars() 30575 1726867686.71763: Calling all_inventory to load vars for managed_node3 30575 1726867686.71765: Calling groups_inventory to load vars for managed_node3 30575 1726867686.71767: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867686.71784: Calling all_plugins_play to load vars for managed_node3 30575 1726867686.71787: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867686.71791: Calling groups_plugins_play to load vars for managed_node3 30575 1726867686.72530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867686.73387: done with get_vars() 30575 1726867686.73402: done getting variables 30575 1726867686.73441: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:28:06 -0400 (0:00:00.035) 0:02:02.112 ****** 30575 1726867686.73464: entering _queue_task() for managed_node3/debug 30575 1726867686.73667: worker is 1 (out of 1 available) 30575 1726867686.73683: exiting _queue_task() for managed_node3/debug 30575 1726867686.73696: done queuing things up, now waiting for results queue to drain 30575 1726867686.73698: waiting for pending results... 
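Twice in this stretch the log records "Evaluated conditional (network_state != {}): False" followed by "when evaluation is False, skipping this task": since no desired `network_state` was supplied, the role default is an empty dict and the state-based tasks are skipped. A minimal Python sketch of that gating check (not Ansible internals; the function name is illustrative):

```python
# Mirrors the role conditional seen in the log: `when: network_state != {}`.
# An empty dict (the role default) makes the conditional False -> task skipped.
def should_run_network_state_task(network_state: dict) -> bool:
    return network_state != {}

print(should_run_network_state_task({}))                   # role default: skipped
print(should_run_network_state_task({"interfaces": []}))   # user-supplied state: runs
```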
30575 1726867686.73880: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30575 1726867686.73973: in run() - task 0affcac9-a3a5-e081-a588-0000000026a8 30575 1726867686.73988: variable 'ansible_search_path' from source: unknown 30575 1726867686.73991: variable 'ansible_search_path' from source: unknown 30575 1726867686.74016: calling self._execute() 30575 1726867686.74095: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867686.74099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867686.74108: variable 'omit' from source: magic vars 30575 1726867686.74382: variable 'ansible_distribution_major_version' from source: facts 30575 1726867686.74391: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867686.74476: variable 'network_state' from source: role '' defaults 30575 1726867686.74485: Evaluated conditional (network_state != {}): False 30575 1726867686.74488: when evaluation is False, skipping this task 30575 1726867686.74491: _execute() done 30575 1726867686.74493: dumping result to json 30575 1726867686.74497: done dumping result, returning 30575 1726867686.74505: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-e081-a588-0000000026a8] 30575 1726867686.74510: sending task result for task 0affcac9-a3a5-e081-a588-0000000026a8 30575 1726867686.74595: done sending task result for task 0affcac9-a3a5-e081-a588-0000000026a8 30575 1726867686.74598: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30575 1726867686.74645: no more pending results, returning what we have 30575 1726867686.74648: results queue empty 30575 1726867686.74649: checking for any_errors_fatal 30575 1726867686.74654: done checking for any_errors_fatal 30575 1726867686.74655: checking for 
max_fail_percentage 30575 1726867686.74657: done checking for max_fail_percentage 30575 1726867686.74658: checking to see if all hosts have failed and the running result is not ok 30575 1726867686.74659: done checking to see if all hosts have failed 30575 1726867686.74659: getting the remaining hosts for this loop 30575 1726867686.74661: done getting the remaining hosts for this loop 30575 1726867686.74664: getting the next task for host managed_node3 30575 1726867686.74671: done getting next task for host managed_node3 30575 1726867686.74674: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867686.74681: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867686.74703: getting variables 30575 1726867686.74705: in VariableManager get_vars() 30575 1726867686.74743: Calling all_inventory to load vars for managed_node3 30575 1726867686.74745: Calling groups_inventory to load vars for managed_node3 30575 1726867686.74747: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867686.74755: Calling all_plugins_play to load vars for managed_node3 30575 1726867686.74757: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867686.74760: Calling groups_plugins_play to load vars for managed_node3 30575 1726867686.75631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867686.76465: done with get_vars() 30575 1726867686.76482: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:28:06 -0400 (0:00:00.030) 0:02:02.143 ****** 30575 1726867686.76544: entering _queue_task() for managed_node3/ping 30575 1726867686.76741: worker is 1 (out of 1 available) 30575 1726867686.76753: exiting _queue_task() for managed_node3/ping 30575 1726867686.76766: done queuing things up, now waiting for results queue to drain 30575 1726867686.76768: waiting for pending results... 
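The "Re-test connectivity" task (role `tasks/main.yml:192`) dispatches the `ping` action for managed_node3, as the `entering _queue_task() for managed_node3/ping` line shows. It is presumably equivalent to a bare ping task like this sketch (inferred from the logged action, not copied from the role):

```yaml
# Sketch of the connectivity re-test being queued; `ping` round-trips a
# trivial module over the connection to confirm the host is still reachable.
- name: Re-test connectivity
  ansible.builtin.ping:
```

The subsequent `_low_level_execute_command()` lines are this ping executing: an `echo ~` probe to resolve the remote home directory, then `mkdir -p` to create the per-task temporary directory under `/root/.ansible/tmp`.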
30575 1726867686.76948: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30575 1726867686.77032: in run() - task 0affcac9-a3a5-e081-a588-0000000026a9 30575 1726867686.77044: variable 'ansible_search_path' from source: unknown 30575 1726867686.77047: variable 'ansible_search_path' from source: unknown 30575 1726867686.77074: calling self._execute() 30575 1726867686.77152: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867686.77155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867686.77164: variable 'omit' from source: magic vars 30575 1726867686.77434: variable 'ansible_distribution_major_version' from source: facts 30575 1726867686.77443: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867686.77449: variable 'omit' from source: magic vars 30575 1726867686.77493: variable 'omit' from source: magic vars 30575 1726867686.77515: variable 'omit' from source: magic vars 30575 1726867686.77548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867686.77573: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867686.77590: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867686.77603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867686.77613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867686.77639: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867686.77643: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867686.77645: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867686.77713: Set connection var ansible_pipelining to False 30575 1726867686.77716: Set connection var ansible_shell_type to sh 30575 1726867686.77723: Set connection var ansible_shell_executable to /bin/sh 30575 1726867686.77729: Set connection var ansible_timeout to 10 30575 1726867686.77733: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867686.77740: Set connection var ansible_connection to ssh 30575 1726867686.77762: variable 'ansible_shell_executable' from source: unknown 30575 1726867686.77765: variable 'ansible_connection' from source: unknown 30575 1726867686.77768: variable 'ansible_module_compression' from source: unknown 30575 1726867686.77770: variable 'ansible_shell_type' from source: unknown 30575 1726867686.77772: variable 'ansible_shell_executable' from source: unknown 30575 1726867686.77774: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867686.77776: variable 'ansible_pipelining' from source: unknown 30575 1726867686.77780: variable 'ansible_timeout' from source: unknown 30575 1726867686.77782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867686.77924: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867686.77933: variable 'omit' from source: magic vars 30575 1726867686.77939: starting attempt loop 30575 1726867686.77942: running the handler 30575 1726867686.77954: _low_level_execute_command(): starting 30575 1726867686.77961: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867686.78450: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 
1726867686.78482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.78486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867686.78488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867686.78490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.78535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867686.78538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867686.78547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867686.78611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867686.80298: stdout chunk (state=3): >>>/root <<< 30575 1726867686.80395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867686.80420: stderr chunk (state=3): >>><<< 30575 1726867686.80425: stdout chunk (state=3): >>><<< 30575 1726867686.80444: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867686.80455: _low_level_execute_command(): starting 30575 1726867686.80460: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867686.8044348-35768-233903127661767 `" && echo ansible-tmp-1726867686.8044348-35768-233903127661767="` echo /root/.ansible/tmp/ansible-tmp-1726867686.8044348-35768-233903127661767 `" ) && sleep 0' 30575 1726867686.80871: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867686.80874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.80886: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867686.80888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.80933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867686.80940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867686.80986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867686.82868: stdout chunk (state=3): >>>ansible-tmp-1726867686.8044348-35768-233903127661767=/root/.ansible/tmp/ansible-tmp-1726867686.8044348-35768-233903127661767 <<< 30575 1726867686.82970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867686.82995: stderr chunk (state=3): >>><<< 30575 1726867686.82998: stdout chunk (state=3): >>><<< 30575 1726867686.83011: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867686.8044348-35768-233903127661767=/root/.ansible/tmp/ansible-tmp-1726867686.8044348-35768-233903127661767 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867686.83044: variable 'ansible_module_compression' from source: unknown 30575 1726867686.83075: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30575 1726867686.83108: variable 'ansible_facts' from source: unknown 30575 1726867686.83160: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867686.8044348-35768-233903127661767/AnsiballZ_ping.py 30575 1726867686.83253: Sending initial data 30575 1726867686.83257: Sent initial data (153 bytes) 30575 1726867686.83669: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867686.83672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.83674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867686.83676: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867686.83681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.83722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867686.83726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867686.83782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867686.85320: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867686.85327: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867686.85360: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867686.85407: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpw3jr50q2 /root/.ansible/tmp/ansible-tmp-1726867686.8044348-35768-233903127661767/AnsiballZ_ping.py <<< 30575 1726867686.85414: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867686.8044348-35768-233903127661767/AnsiballZ_ping.py" <<< 30575 1726867686.85450: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpw3jr50q2" to remote "/root/.ansible/tmp/ansible-tmp-1726867686.8044348-35768-233903127661767/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867686.8044348-35768-233903127661767/AnsiballZ_ping.py" <<< 30575 1726867686.85964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867686.85997: stderr chunk (state=3): >>><<< 30575 1726867686.86000: stdout chunk (state=3): >>><<< 30575 1726867686.86041: done transferring module to remote 30575 1726867686.86049: _low_level_execute_command(): starting 30575 1726867686.86052: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867686.8044348-35768-233903127661767/ /root/.ansible/tmp/ansible-tmp-1726867686.8044348-35768-233903127661767/AnsiballZ_ping.py && sleep 0' 30575 1726867686.86456: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867686.86461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867686.86465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 
1726867686.86468: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867686.86473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.86520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867686.86525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867686.86568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867686.88299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867686.88321: stderr chunk (state=3): >>><<< 30575 1726867686.88324: stdout chunk (state=3): >>><<< 30575 1726867686.88333: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867686.88336: _low_level_execute_command(): starting 30575 1726867686.88341: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867686.8044348-35768-233903127661767/AnsiballZ_ping.py && sleep 0' 30575 1726867686.88721: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867686.88725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.88744: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867686.88793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867686.88797: stderr chunk (state=3): >>>debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867686.88850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867687.03621: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30575 1726867687.04819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867687.04848: stderr chunk (state=3): >>><<< 30575 1726867687.04852: stdout chunk (state=3): >>><<< 30575 1726867687.04870: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
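The exchange above is one complete `_low_level_execute_command()` lifecycle as the log records it: make a private remote temp dir, sftp the `AnsiballZ_ping.py` payload across, `chmod u+x` it, run it with the remote Python, and read the module's JSON result from stdout (the `rm -f -r` cleanup follows just below). This is a hedged sketch that merely reconstructs the shell command strings visible in the log — the function names are illustrative, not Ansible's internal API:

```python
# Minimal sketch (NOT Ansible's actual API) of the command sequence the log shows
# for one module run. Step 2, the payload transfer, happens over sftp, not the shell.
import json

def build_module_commands(remote_tmp: str, module: str = "AnsiballZ_ping.py",
                          python: str = "/usr/bin/python3.12") -> list[str]:
    """Return the /bin/sh commands from the log, in execution order."""
    return [
        # 1. create a private temp dir (umask 77 -> owner-only permissions)
        f"/bin/sh -c '( umask 77 && mkdir -p \"{remote_tmp}\" ) && sleep 0'",
        # 3. mark the dir and the transferred module executable for the owner
        f"/bin/sh -c 'chmod u+x {remote_tmp}/ {remote_tmp}/{module} && sleep 0'",
        # 4. run the module; it prints its result as a JSON document on stdout
        f"/bin/sh -c '{python} {remote_tmp}/{module} && sleep 0'",
        # 5. remove the temp dir once the result has been collected
        f"/bin/sh -c 'rm -f -r {remote_tmp}/ > /dev/null 2>&1 && sleep 0'",
    ]

def parse_module_result(stdout: str) -> dict:
    """The controller treats the module's stdout as JSON, e.g. '{"ping": "pong", ...}'."""
    return json.loads(stdout)
```

Note the trailing `&& sleep 0` on every command — a quirk visible throughout the log that keeps the remote shell's exit status well-defined.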
30575 1726867687.04896: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867686.8044348-35768-233903127661767/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867687.04904: _low_level_execute_command(): starting 30575 1726867687.04909: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867686.8044348-35768-233903127661767/ > /dev/null 2>&1 && sleep 0' 30575 1726867687.05366: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867687.05369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867687.05371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.05374: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867687.05375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.05432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867687.05439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867687.05442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867687.05486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867687.07282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867687.07305: stderr chunk (state=3): >>><<< 30575 1726867687.07308: stdout chunk (state=3): >>><<< 30575 1726867687.07324: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 30575 1726867687.07330: handler run complete 30575 1726867687.07342: attempt loop complete, returning result 30575 1726867687.07345: _execute() done 30575 1726867687.07348: dumping result to json 30575 1726867687.07350: done dumping result, returning 30575 1726867687.07360: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-e081-a588-0000000026a9] 30575 1726867687.07362: sending task result for task 0affcac9-a3a5-e081-a588-0000000026a9 30575 1726867687.07455: done sending task result for task 0affcac9-a3a5-e081-a588-0000000026a9 30575 1726867687.07457: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "ping": "pong"
}
30575 1726867687.07529: no more pending results, returning what we have 30575 1726867687.07532: results queue empty 30575 1726867687.07533: checking for any_errors_fatal 30575 1726867687.07539: done checking for any_errors_fatal 30575 1726867687.07540: checking for max_fail_percentage 30575 1726867687.07541: done checking for max_fail_percentage 30575 1726867687.07542: checking to see if all hosts have failed and the running result is not ok 30575 1726867687.07543: done checking to see if all hosts have failed 30575 1726867687.07544: getting the remaining hosts for this loop 30575 1726867687.07545: done getting the remaining hosts for this loop 30575 1726867687.07548: getting the next task for host managed_node3 30575 1726867687.07559: done getting next task for host managed_node3 30575 1726867687.07562: ^ task is: TASK: meta (role_complete) 30575 1726867687.07566: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867687.07584: getting variables 30575 1726867687.07586: in VariableManager get_vars() 30575 1726867687.07636: Calling all_inventory to load vars for managed_node3 30575 1726867687.07638: Calling groups_inventory to load vars for managed_node3 30575 1726867687.07640: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867687.07649: Calling all_plugins_play to load vars for managed_node3 30575 1726867687.07651: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867687.07653: Calling groups_plugins_play to load vars for managed_node3 30575 1726867687.08485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867687.09339: done with get_vars() 30575 1726867687.09355: done getting variables 30575 1726867687.09413: done queuing things up, now waiting for results queue to drain 30575 1726867687.09415: results queue empty 30575 1726867687.09416: checking for any_errors_fatal 30575 1726867687.09418: done checking for 
any_errors_fatal 30575 1726867687.09419: checking for max_fail_percentage 30575 1726867687.09420: done checking for max_fail_percentage 30575 1726867687.09421: checking to see if all hosts have failed and the running result is not ok 30575 1726867687.09421: done checking to see if all hosts have failed 30575 1726867687.09422: getting the remaining hosts for this loop 30575 1726867687.09422: done getting the remaining hosts for this loop 30575 1726867687.09424: getting the next task for host managed_node3 30575 1726867687.09427: done getting next task for host managed_node3 30575 1726867687.09429: ^ task is: TASK: Asserts 30575 1726867687.09430: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867687.09432: getting variables 30575 1726867687.09433: in VariableManager get_vars() 30575 1726867687.09442: Calling all_inventory to load vars for managed_node3 30575 1726867687.09443: Calling groups_inventory to load vars for managed_node3 30575 1726867687.09444: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867687.09448: Calling all_plugins_play to load vars for managed_node3 30575 1726867687.09449: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867687.09450: Calling groups_plugins_play to load vars for managed_node3 30575 1726867687.10157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867687.10989: done with get_vars() 30575 1726867687.11002: done getting variables

TASK [Asserts] *****************************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36
Friday 20 September 2024 17:28:07 -0400 (0:00:00.345) 0:02:02.488 ******
30575 1726867687.11054: entering _queue_task() for managed_node3/include_tasks 30575 1726867687.11334: worker is 1 (out of 1 available) 30575 1726867687.11348: exiting _queue_task() for managed_node3/include_tasks 30575 1726867687.11361: done queuing things up, now waiting for results queue to drain 30575 1726867687.11363: waiting for pending results... 
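The `Asserts` task queued here is an `include_tasks` looping over the `lsr_assert` list, and the records that follow show the task's `when` condition (`ansible_distribution_major_version != '6'`) being evaluated once per `item` before each file is queued for inclusion. A rough Python rendering of that per-item expansion — the function and variable names are illustrative, not Ansible internals:

```python
# Sketch (assumed names, not Ansible's API): expand an include_tasks loop,
# applying the task-level conditional to each loop item, as the log shows.
def expand_include_items(items: list[str], facts: dict) -> list[str]:
    """Keep only the loop items whose 'when' condition evaluates True."""
    included = []
    for item in items:
        # mirrors: Evaluated conditional (ansible_distribution_major_version != '6')
        if facts.get("ansible_distribution_major_version") != "6":
            included.append(item)
    return included
```

With the facts from this run (a non-EL6 host), both items survive, which matches the two `included: ... => (item=...)` records below.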
30575 1726867687.11561: running TaskExecutor() for managed_node3/TASK: Asserts 30575 1726867687.11643: in run() - task 0affcac9-a3a5-e081-a588-0000000020b2 30575 1726867687.11655: variable 'ansible_search_path' from source: unknown 30575 1726867687.11659: variable 'ansible_search_path' from source: unknown 30575 1726867687.11701: variable 'lsr_assert' from source: include params 30575 1726867687.11869: variable 'lsr_assert' from source: include params 30575 1726867687.11929: variable 'omit' from source: magic vars 30575 1726867687.12030: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867687.12038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867687.12046: variable 'omit' from source: magic vars 30575 1726867687.12214: variable 'ansible_distribution_major_version' from source: facts 30575 1726867687.12224: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867687.12229: variable 'item' from source: unknown 30575 1726867687.12280: variable 'item' from source: unknown 30575 1726867687.12301: variable 'item' from source: unknown 30575 1726867687.12347: variable 'item' from source: unknown 30575 1726867687.12470: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867687.12473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867687.12475: variable 'omit' from source: magic vars 30575 1726867687.12555: variable 'ansible_distribution_major_version' from source: facts 30575 1726867687.12558: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867687.12564: variable 'item' from source: unknown 30575 1726867687.12610: variable 'item' from source: unknown 30575 1726867687.12633: variable 'item' from source: unknown 30575 1726867687.12679: variable 'item' from source: unknown 30575 1726867687.12740: dumping result to json 30575 1726867687.12744: done dumping result, returning 30575 
1726867687.12746: done running TaskExecutor() for managed_node3/TASK: Asserts [0affcac9-a3a5-e081-a588-0000000020b2] 30575 1726867687.12748: sending task result for task 0affcac9-a3a5-e081-a588-0000000020b2 30575 1726867687.12783: done sending task result for task 0affcac9-a3a5-e081-a588-0000000020b2 30575 1726867687.12785: WORKER PROCESS EXITING 30575 1726867687.12810: no more pending results, returning what we have 30575 1726867687.12815: in VariableManager get_vars() 30575 1726867687.12863: Calling all_inventory to load vars for managed_node3 30575 1726867687.12866: Calling groups_inventory to load vars for managed_node3 30575 1726867687.12869: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867687.12883: Calling all_plugins_play to load vars for managed_node3 30575 1726867687.12886: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867687.12889: Calling groups_plugins_play to load vars for managed_node3 30575 1726867687.13654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867687.14581: done with get_vars() 30575 1726867687.14594: variable 'ansible_search_path' from source: unknown 30575 1726867687.14595: variable 'ansible_search_path' from source: unknown 30575 1726867687.14622: variable 'ansible_search_path' from source: unknown 30575 1726867687.14623: variable 'ansible_search_path' from source: unknown 30575 1726867687.14639: we have included files to process 30575 1726867687.14639: generating all_blocks data 30575 1726867687.14641: done generating all_blocks data 30575 1726867687.14645: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30575 1726867687.14646: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30575 1726867687.14648: Loading data from 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30575 1726867687.14716: in VariableManager get_vars() 30575 1726867687.14731: done with get_vars() 30575 1726867687.14802: done processing included file 30575 1726867687.14804: iterating over new_blocks loaded from include file 30575 1726867687.14805: in VariableManager get_vars() 30575 1726867687.14816: done with get_vars() 30575 1726867687.14818: filtering new block on tags 30575 1726867687.14840: done filtering new block on tags 30575 1726867687.14842: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node3 => (item=tasks/assert_profile_absent.yml) 30575 1726867687.14845: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 30575 1726867687.14845: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 30575 1726867687.14847: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 30575 1726867687.15081: done processing included file 30575 1726867687.15082: iterating over new_blocks loaded from include file 30575 1726867687.15083: in VariableManager get_vars() 30575 1726867687.15093: done with get_vars() 30575 1726867687.15094: filtering new block on tags 30575 1726867687.15120: done filtering new block on tags 30575 1726867687.15122: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml for managed_node3 => (item=tasks/get_NetworkManager_NVR.yml) 30575 1726867687.15124: extending task lists 
for all hosts with included blocks 30575 1726867687.15738: done extending task lists 30575 1726867687.15739: done processing included files 30575 1726867687.15739: results queue empty 30575 1726867687.15740: checking for any_errors_fatal 30575 1726867687.15741: done checking for any_errors_fatal 30575 1726867687.15741: checking for max_fail_percentage 30575 1726867687.15742: done checking for max_fail_percentage 30575 1726867687.15742: checking to see if all hosts have failed and the running result is not ok 30575 1726867687.15743: done checking to see if all hosts have failed 30575 1726867687.15743: getting the remaining hosts for this loop 30575 1726867687.15744: done getting the remaining hosts for this loop 30575 1726867687.15746: getting the next task for host managed_node3 30575 1726867687.15748: done getting next task for host managed_node3 30575 1726867687.15750: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30575 1726867687.15752: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867687.15753: getting variables 30575 1726867687.15758: in VariableManager get_vars() 30575 1726867687.15765: Calling all_inventory to load vars for managed_node3 30575 1726867687.15767: Calling groups_inventory to load vars for managed_node3 30575 1726867687.15768: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867687.15771: Calling all_plugins_play to load vars for managed_node3 30575 1726867687.15773: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867687.15774: Calling groups_plugins_play to load vars for managed_node3 30575 1726867687.16396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867687.17231: done with get_vars() 30575 1726867687.17245: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 17:28:07 -0400 (0:00:00.062) 0:02:02.550 ****** 30575 1726867687.17294: entering _queue_task() for managed_node3/include_tasks 30575 1726867687.17515: worker is 1 (out of 1 available) 30575 1726867687.17529: exiting _queue_task() for managed_node3/include_tasks 30575 1726867687.17542: done queuing things up, now waiting for results queue to drain 30575 1726867687.17544: waiting for pending results... 
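[Annotation: the records above show assert_profile_absent.yml:3 queuing an include of get_profile_stat.yml; the subsequent records evaluate the conditional `ansible_distribution_major_version != '6'` before running it. The actual YAML is not reproduced in this log; the task likely resembles the following hypothetical sketch.]

```yaml
# Hypothetical reconstruction of assert_profile_absent.yml:3, based only on
# the task name and the conditional logged above -- the real file may differ.
- name: Include the task 'get_profile_stat.yml'
  include_tasks: get_profile_stat.yml
  when: ansible_distribution_major_version != '6'
```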
30575 1726867687.17730: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 30575 1726867687.17808: in run() - task 0affcac9-a3a5-e081-a588-000000002804 30575 1726867687.17823: variable 'ansible_search_path' from source: unknown 30575 1726867687.17827: variable 'ansible_search_path' from source: unknown 30575 1726867687.17852: calling self._execute() 30575 1726867687.17922: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867687.17926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867687.17935: variable 'omit' from source: magic vars 30575 1726867687.18204: variable 'ansible_distribution_major_version' from source: facts 30575 1726867687.18208: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867687.18214: _execute() done 30575 1726867687.18221: dumping result to json 30575 1726867687.18224: done dumping result, returning 30575 1726867687.18230: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-e081-a588-000000002804] 30575 1726867687.18235: sending task result for task 0affcac9-a3a5-e081-a588-000000002804 30575 1726867687.18316: done sending task result for task 0affcac9-a3a5-e081-a588-000000002804 30575 1726867687.18322: WORKER PROCESS EXITING 30575 1726867687.18345: no more pending results, returning what we have 30575 1726867687.18350: in VariableManager get_vars() 30575 1726867687.18401: Calling all_inventory to load vars for managed_node3 30575 1726867687.18404: Calling groups_inventory to load vars for managed_node3 30575 1726867687.18408: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867687.18420: Calling all_plugins_play to load vars for managed_node3 30575 1726867687.18423: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867687.18426: Calling groups_plugins_play to load vars for managed_node3 30575 
1726867687.23249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867687.24085: done with get_vars() 30575 1726867687.24099: variable 'ansible_search_path' from source: unknown 30575 1726867687.24100: variable 'ansible_search_path' from source: unknown 30575 1726867687.24106: variable 'item' from source: include params 30575 1726867687.24165: variable 'item' from source: include params 30575 1726867687.24189: we have included files to process 30575 1726867687.24190: generating all_blocks data 30575 1726867687.24191: done generating all_blocks data 30575 1726867687.24192: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867687.24192: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867687.24193: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30575 1726867687.24747: done processing included file 30575 1726867687.24748: iterating over new_blocks loaded from include file 30575 1726867687.24749: in VariableManager get_vars() 30575 1726867687.24760: done with get_vars() 30575 1726867687.24761: filtering new block on tags 30575 1726867687.24801: done filtering new block on tags 30575 1726867687.24803: in VariableManager get_vars() 30575 1726867687.24812: done with get_vars() 30575 1726867687.24813: filtering new block on tags 30575 1726867687.24845: done filtering new block on tags 30575 1726867687.24847: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 30575 1726867687.24850: extending task lists for all hosts with included blocks 30575 1726867687.24979: done 
extending task lists 30575 1726867687.24980: done processing included files 30575 1726867687.24980: results queue empty 30575 1726867687.24981: checking for any_errors_fatal 30575 1726867687.24983: done checking for any_errors_fatal 30575 1726867687.24983: checking for max_fail_percentage 30575 1726867687.24984: done checking for max_fail_percentage 30575 1726867687.24985: checking to see if all hosts have failed and the running result is not ok 30575 1726867687.24985: done checking to see if all hosts have failed 30575 1726867687.24985: getting the remaining hosts for this loop 30575 1726867687.24986: done getting the remaining hosts for this loop 30575 1726867687.24988: getting the next task for host managed_node3 30575 1726867687.24990: done getting next task for host managed_node3 30575 1726867687.24991: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30575 1726867687.24993: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30575 1726867687.24995: getting variables 30575 1726867687.24995: in VariableManager get_vars() 30575 1726867687.25002: Calling all_inventory to load vars for managed_node3 30575 1726867687.25003: Calling groups_inventory to load vars for managed_node3 30575 1726867687.25005: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867687.25010: Calling all_plugins_play to load vars for managed_node3 30575 1726867687.25011: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867687.25013: Calling groups_plugins_play to load vars for managed_node3 30575 1726867687.25612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867687.26446: done with get_vars() 30575 1726867687.26459: done getting variables 30575 1726867687.26486: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:28:07 -0400 (0:00:00.092) 0:02:02.642 ****** 30575 1726867687.26504: entering _queue_task() for managed_node3/set_fact 30575 1726867687.26780: worker is 1 (out of 1 available) 30575 1726867687.26794: exiting _queue_task() for managed_node3/set_fact 30575 1726867687.26807: done queuing things up, now waiting for results queue to drain 30575 1726867687.26809: waiting for pending results... 
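[Annotation: the records above queue a set_fact task from get_profile_stat.yml:3, and the task result logged further on shows three facts all initialized to false. A hypothetical sketch of that task, reconstructed from the logged `ansible_facts` (the real file may differ):]

```yaml
# Hypothetical reconstruction of get_profile_stat.yml:3; the fact names and
# values are taken from the ok: result recorded later in this log.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```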
30575 1726867687.26993: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 30575 1726867687.27067: in run() - task 0affcac9-a3a5-e081-a588-000000002888 30575 1726867687.27082: variable 'ansible_search_path' from source: unknown 30575 1726867687.27085: variable 'ansible_search_path' from source: unknown 30575 1726867687.27111: calling self._execute() 30575 1726867687.27186: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867687.27189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867687.27198: variable 'omit' from source: magic vars 30575 1726867687.27471: variable 'ansible_distribution_major_version' from source: facts 30575 1726867687.27483: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867687.27487: variable 'omit' from source: magic vars 30575 1726867687.27523: variable 'omit' from source: magic vars 30575 1726867687.27546: variable 'omit' from source: magic vars 30575 1726867687.27579: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867687.27607: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867687.27625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867687.27638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867687.27649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867687.27672: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867687.27676: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867687.27681: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30575 1726867687.27747: Set connection var ansible_pipelining to False 30575 1726867687.27751: Set connection var ansible_shell_type to sh 30575 1726867687.27754: Set connection var ansible_shell_executable to /bin/sh 30575 1726867687.27760: Set connection var ansible_timeout to 10 30575 1726867687.27765: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867687.27771: Set connection var ansible_connection to ssh 30575 1726867687.27790: variable 'ansible_shell_executable' from source: unknown 30575 1726867687.27794: variable 'ansible_connection' from source: unknown 30575 1726867687.27796: variable 'ansible_module_compression' from source: unknown 30575 1726867687.27801: variable 'ansible_shell_type' from source: unknown 30575 1726867687.27803: variable 'ansible_shell_executable' from source: unknown 30575 1726867687.27806: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867687.27808: variable 'ansible_pipelining' from source: unknown 30575 1726867687.27810: variable 'ansible_timeout' from source: unknown 30575 1726867687.27812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867687.27907: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867687.27916: variable 'omit' from source: magic vars 30575 1726867687.27923: starting attempt loop 30575 1726867687.27926: running the handler 30575 1726867687.27938: handler run complete 30575 1726867687.27946: attempt loop complete, returning result 30575 1726867687.27949: _execute() done 30575 1726867687.27951: dumping result to json 30575 1726867687.27954: done dumping result, returning 30575 1726867687.27960: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-e081-a588-000000002888] 30575 1726867687.27965: sending task result for task 0affcac9-a3a5-e081-a588-000000002888 30575 1726867687.28041: done sending task result for task 0affcac9-a3a5-e081-a588-000000002888 30575 1726867687.28044: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30575 1726867687.28094: no more pending results, returning what we have 30575 1726867687.28098: results queue empty 30575 1726867687.28099: checking for any_errors_fatal 30575 1726867687.28100: done checking for any_errors_fatal 30575 1726867687.28101: checking for max_fail_percentage 30575 1726867687.28102: done checking for max_fail_percentage 30575 1726867687.28103: checking to see if all hosts have failed and the running result is not ok 30575 1726867687.28104: done checking to see if all hosts have failed 30575 1726867687.28104: getting the remaining hosts for this loop 30575 1726867687.28106: done getting the remaining hosts for this loop 30575 1726867687.28109: getting the next task for host managed_node3 30575 1726867687.28120: done getting next task for host managed_node3 30575 1726867687.28123: ^ task is: TASK: Stat profile file 30575 1726867687.28128: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867687.28132: getting variables 30575 1726867687.28134: in VariableManager get_vars() 30575 1726867687.28203: Calling all_inventory to load vars for managed_node3 30575 1726867687.28206: Calling groups_inventory to load vars for managed_node3 30575 1726867687.28209: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867687.28222: Calling all_plugins_play to load vars for managed_node3 30575 1726867687.28225: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867687.28228: Calling groups_plugins_play to load vars for managed_node3 30575 1726867687.29071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867687.29936: done with get_vars() 30575 1726867687.29951: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:28:07 -0400 (0:00:00.035) 0:02:02.677 ****** 30575 1726867687.30012: entering _queue_task() for managed_node3/stat 30575 1726867687.30234: worker is 1 (out of 1 available) 30575 1726867687.30248: exiting _queue_task() for managed_node3/stat 30575 1726867687.30263: done queuing things up, now waiting for results queue to drain 30575 1726867687.30265: 
waiting for pending results... 30575 1726867687.30445: running TaskExecutor() for managed_node3/TASK: Stat profile file 30575 1726867687.30516: in run() - task 0affcac9-a3a5-e081-a588-000000002889 30575 1726867687.30532: variable 'ansible_search_path' from source: unknown 30575 1726867687.30536: variable 'ansible_search_path' from source: unknown 30575 1726867687.30562: calling self._execute() 30575 1726867687.30636: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867687.30640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867687.30649: variable 'omit' from source: magic vars 30575 1726867687.30920: variable 'ansible_distribution_major_version' from source: facts 30575 1726867687.30932: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867687.30939: variable 'omit' from source: magic vars 30575 1726867687.30972: variable 'omit' from source: magic vars 30575 1726867687.31045: variable 'profile' from source: play vars 30575 1726867687.31048: variable 'interface' from source: play vars 30575 1726867687.31099: variable 'interface' from source: play vars 30575 1726867687.31113: variable 'omit' from source: magic vars 30575 1726867687.31147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867687.31179: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867687.31196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867687.31210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867687.31222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867687.31243: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30575 1726867687.31246: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867687.31248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867687.31321: Set connection var ansible_pipelining to False 30575 1726867687.31325: Set connection var ansible_shell_type to sh 30575 1726867687.31327: Set connection var ansible_shell_executable to /bin/sh 30575 1726867687.31332: Set connection var ansible_timeout to 10 30575 1726867687.31337: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867687.31343: Set connection var ansible_connection to ssh 30575 1726867687.31362: variable 'ansible_shell_executable' from source: unknown 30575 1726867687.31366: variable 'ansible_connection' from source: unknown 30575 1726867687.31369: variable 'ansible_module_compression' from source: unknown 30575 1726867687.31371: variable 'ansible_shell_type' from source: unknown 30575 1726867687.31373: variable 'ansible_shell_executable' from source: unknown 30575 1726867687.31375: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867687.31379: variable 'ansible_pipelining' from source: unknown 30575 1726867687.31382: variable 'ansible_timeout' from source: unknown 30575 1726867687.31384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867687.31581: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867687.31585: variable 'omit' from source: magic vars 30575 1726867687.31587: starting attempt loop 30575 1726867687.31588: running the handler 30575 1726867687.31589: _low_level_execute_command(): starting 30575 1726867687.31591: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 
1726867687.32048: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867687.32071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.32114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867687.32134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867687.32138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867687.32196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867687.33883: stdout chunk (state=3): >>>/root <<< 30575 1726867687.33980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867687.34006: stderr chunk (state=3): >>><<< 30575 1726867687.34009: stdout chunk (state=3): >>><<< 30575 1726867687.34030: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867687.34042: _low_level_execute_command(): starting 30575 1726867687.34047: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867687.3402972-35779-150663315563076 `" && echo ansible-tmp-1726867687.3402972-35779-150663315563076="` echo /root/.ansible/tmp/ansible-tmp-1726867687.3402972-35779-150663315563076 `" ) && sleep 0' 30575 1726867687.34460: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867687.34464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.34473: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867687.34476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867687.34479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.34521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867687.34529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867687.34572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867687.36466: stdout chunk (state=3): >>>ansible-tmp-1726867687.3402972-35779-150663315563076=/root/.ansible/tmp/ansible-tmp-1726867687.3402972-35779-150663315563076 <<< 30575 1726867687.36573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867687.36601: stderr chunk (state=3): >>><<< 30575 1726867687.36604: stdout chunk (state=3): >>><<< 30575 1726867687.36616: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867687.3402972-35779-150663315563076=/root/.ansible/tmp/ansible-tmp-1726867687.3402972-35779-150663315563076 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867687.36651: variable 'ansible_module_compression' from source: unknown 30575 1726867687.36698: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30575 1726867687.36731: variable 'ansible_facts' from source: unknown 30575 1726867687.36791: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867687.3402972-35779-150663315563076/AnsiballZ_stat.py 30575 1726867687.36883: Sending initial data 30575 1726867687.36887: Sent initial data (153 bytes) 30575 1726867687.37312: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867687.37315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867687.37317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.37320: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867687.37323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.37371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867687.37374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867687.37423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867687.38960: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30575 1726867687.38963: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867687.39003: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867687.39047: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpb0tejjm5 /root/.ansible/tmp/ansible-tmp-1726867687.3402972-35779-150663315563076/AnsiballZ_stat.py <<< 30575 1726867687.39051: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867687.3402972-35779-150663315563076/AnsiballZ_stat.py" <<< 30575 1726867687.39086: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpb0tejjm5" to remote "/root/.ansible/tmp/ansible-tmp-1726867687.3402972-35779-150663315563076/AnsiballZ_stat.py" <<< 30575 1726867687.39094: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867687.3402972-35779-150663315563076/AnsiballZ_stat.py" <<< 30575 1726867687.39617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867687.39651: stderr chunk (state=3): >>><<< 30575 1726867687.39654: stdout chunk (state=3): >>><<< 30575 1726867687.39685: done transferring module to remote 30575 1726867687.39693: _low_level_execute_command(): starting 30575 1726867687.39701: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867687.3402972-35779-150663315563076/ /root/.ansible/tmp/ansible-tmp-1726867687.3402972-35779-150663315563076/AnsiballZ_stat.py && sleep 0' 30575 1726867687.40099: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867687.40102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867687.40105: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867687.40107: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867687.40112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.40156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867687.40159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867687.40207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867687.41985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867687.42004: stderr chunk (state=3): >>><<< 30575 1726867687.42007: stdout chunk (state=3): >>><<< 30575 1726867687.42021: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867687.42025: _low_level_execute_command(): starting 30575 1726867687.42027: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867687.3402972-35779-150663315563076/AnsiballZ_stat.py && sleep 0' 30575 1726867687.42425: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867687.42428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867687.42431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.42433: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867687.42435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.42481: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867687.42484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867687.42537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867687.57655: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30575 1726867687.58943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867687.58973: stderr chunk (state=3): >>><<< 30575 1726867687.58976: stdout chunk (state=3): >>><<< 30575 1726867687.58996: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867687.59020: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867687.3402972-35779-150663315563076/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867687.59031: _low_level_execute_command(): starting 30575 1726867687.59035: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867687.3402972-35779-150663315563076/ > /dev/null 2>&1 && sleep 0' 30575 1726867687.59485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867687.59489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867687.59491: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.59495: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867687.59497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.59550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867687.59557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867687.59559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867687.59603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867687.61417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867687.61444: stderr chunk (state=3): >>><<< 30575 1726867687.61448: stdout chunk (state=3): >>><<< 30575 1726867687.61460: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867687.61466: handler run complete 30575 1726867687.61483: attempt loop complete, returning result 30575 1726867687.61485: _execute() done 30575 1726867687.61488: dumping result to json 30575 1726867687.61492: done dumping result, returning 30575 1726867687.61499: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcac9-a3a5-e081-a588-000000002889] 30575 1726867687.61504: sending task result for task 0affcac9-a3a5-e081-a588-000000002889 30575 1726867687.61595: done sending task result for task 0affcac9-a3a5-e081-a588-000000002889 30575 1726867687.61598: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30575 1726867687.61653: no more pending results, returning what we have 30575 1726867687.61657: results queue empty 30575 1726867687.61658: checking for any_errors_fatal 30575 1726867687.61665: done checking for any_errors_fatal 30575 1726867687.61665: checking for max_fail_percentage 30575 1726867687.61667: done checking for max_fail_percentage 30575 1726867687.61668: checking to see if all hosts have failed and the running result is not ok 30575 1726867687.61669: done checking to see if all hosts have failed 30575 1726867687.61669: getting the remaining hosts for this loop 30575 1726867687.61671: done getting the remaining hosts for this loop 30575 1726867687.61675: getting the 
next task for host managed_node3 30575 1726867687.61685: done getting next task for host managed_node3 30575 1726867687.61688: ^ task is: TASK: Set NM profile exist flag based on the profile files 30575 1726867687.61692: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867687.61696: getting variables 30575 1726867687.61697: in VariableManager get_vars() 30575 1726867687.61746: Calling all_inventory to load vars for managed_node3 30575 1726867687.61749: Calling groups_inventory to load vars for managed_node3 30575 1726867687.61752: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867687.61762: Calling all_plugins_play to load vars for managed_node3 30575 1726867687.61765: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867687.61767: Calling groups_plugins_play to load vars for managed_node3 30575 1726867687.62608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867687.63600: done with get_vars() 30575 1726867687.63615: done getting variables 30575 1726867687.63658: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:28:07 -0400 (0:00:00.336) 0:02:03.014 ****** 30575 1726867687.63683: entering _queue_task() for managed_node3/set_fact 30575 1726867687.63913: worker is 1 (out of 1 available) 30575 1726867687.63926: exiting _queue_task() for managed_node3/set_fact 30575 1726867687.63940: done queuing things up, now waiting for results queue to drain 30575 1726867687.63941: waiting for pending results... 
30575 1726867687.64124: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 30575 1726867687.64216: in run() - task 0affcac9-a3a5-e081-a588-00000000288a 30575 1726867687.64231: variable 'ansible_search_path' from source: unknown 30575 1726867687.64235: variable 'ansible_search_path' from source: unknown 30575 1726867687.64261: calling self._execute() 30575 1726867687.64337: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867687.64341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867687.64348: variable 'omit' from source: magic vars 30575 1726867687.64625: variable 'ansible_distribution_major_version' from source: facts 30575 1726867687.64634: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867687.64725: variable 'profile_stat' from source: set_fact 30575 1726867687.64733: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867687.64736: when evaluation is False, skipping this task 30575 1726867687.64739: _execute() done 30575 1726867687.64742: dumping result to json 30575 1726867687.64745: done dumping result, returning 30575 1726867687.64752: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-e081-a588-00000000288a] 30575 1726867687.64757: sending task result for task 0affcac9-a3a5-e081-a588-00000000288a 30575 1726867687.64840: done sending task result for task 0affcac9-a3a5-e081-a588-00000000288a 30575 1726867687.64843: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867687.64894: no more pending results, returning what we have 30575 1726867687.64897: results queue empty 30575 1726867687.64898: checking for any_errors_fatal 30575 1726867687.64906: done checking for any_errors_fatal 30575 1726867687.64907: 
checking for max_fail_percentage 30575 1726867687.64908: done checking for max_fail_percentage 30575 1726867687.64909: checking to see if all hosts have failed and the running result is not ok 30575 1726867687.64910: done checking to see if all hosts have failed 30575 1726867687.64911: getting the remaining hosts for this loop 30575 1726867687.64912: done getting the remaining hosts for this loop 30575 1726867687.64916: getting the next task for host managed_node3 30575 1726867687.64923: done getting next task for host managed_node3 30575 1726867687.64926: ^ task is: TASK: Get NM profile info 30575 1726867687.64930: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867687.64934: getting variables 30575 1726867687.64935: in VariableManager get_vars() 30575 1726867687.64978: Calling all_inventory to load vars for managed_node3 30575 1726867687.64981: Calling groups_inventory to load vars for managed_node3 30575 1726867687.64984: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867687.64993: Calling all_plugins_play to load vars for managed_node3 30575 1726867687.64996: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867687.64998: Calling groups_plugins_play to load vars for managed_node3 30575 1726867687.66285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867687.67282: done with get_vars() 30575 1726867687.67297: done getting variables 30575 1726867687.67340: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:28:07 -0400 (0:00:00.036) 0:02:03.051 ****** 30575 1726867687.67365: entering _queue_task() for managed_node3/shell 30575 1726867687.67568: worker is 1 (out of 1 available) 30575 1726867687.67581: exiting _queue_task() for managed_node3/shell 30575 1726867687.67593: done queuing things up, now waiting for results queue to drain 30575 1726867687.67594: waiting for pending results... 
30575 1726867687.67776: running TaskExecutor() for managed_node3/TASK: Get NM profile info 30575 1726867687.67850: in run() - task 0affcac9-a3a5-e081-a588-00000000288b 30575 1726867687.67864: variable 'ansible_search_path' from source: unknown 30575 1726867687.67867: variable 'ansible_search_path' from source: unknown 30575 1726867687.67897: calling self._execute() 30575 1726867687.67972: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867687.67976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867687.67986: variable 'omit' from source: magic vars 30575 1726867687.68256: variable 'ansible_distribution_major_version' from source: facts 30575 1726867687.68269: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867687.68274: variable 'omit' from source: magic vars 30575 1726867687.68314: variable 'omit' from source: magic vars 30575 1726867687.68391: variable 'profile' from source: play vars 30575 1726867687.68395: variable 'interface' from source: play vars 30575 1726867687.68445: variable 'interface' from source: play vars 30575 1726867687.68460: variable 'omit' from source: magic vars 30575 1726867687.68496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867687.68522: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867687.68539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867687.68552: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867687.68562: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867687.68589: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 
1726867687.68592: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867687.68595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867687.68660: Set connection var ansible_pipelining to False 30575 1726867687.68663: Set connection var ansible_shell_type to sh 30575 1726867687.68668: Set connection var ansible_shell_executable to /bin/sh 30575 1726867687.68673: Set connection var ansible_timeout to 10 30575 1726867687.68679: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867687.68691: Set connection var ansible_connection to ssh 30575 1726867687.68705: variable 'ansible_shell_executable' from source: unknown 30575 1726867687.68708: variable 'ansible_connection' from source: unknown 30575 1726867687.68710: variable 'ansible_module_compression' from source: unknown 30575 1726867687.68712: variable 'ansible_shell_type' from source: unknown 30575 1726867687.68715: variable 'ansible_shell_executable' from source: unknown 30575 1726867687.68717: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867687.68723: variable 'ansible_pipelining' from source: unknown 30575 1726867687.68726: variable 'ansible_timeout' from source: unknown 30575 1726867687.68728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867687.68830: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867687.68838: variable 'omit' from source: magic vars 30575 1726867687.68843: starting attempt loop 30575 1726867687.68846: running the handler 30575 1726867687.68855: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867687.68873: _low_level_execute_command(): starting 30575 1726867687.68880: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867687.69375: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867687.69381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.69385: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867687.69388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.69436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867687.69439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867687.69497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867687.71112: stdout chunk (state=3): >>>/root <<< 30575 1726867687.71213: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 30575 1726867687.71242: stderr chunk (state=3): >>><<< 30575 1726867687.71246: stdout chunk (state=3): >>><<< 30575 1726867687.71264: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867687.71274: _low_level_execute_command(): starting 30575 1726867687.71281: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867687.7126307-35791-192058019590793 `" && echo ansible-tmp-1726867687.7126307-35791-192058019590793="` echo /root/.ansible/tmp/ansible-tmp-1726867687.7126307-35791-192058019590793 `" ) && sleep 0' 30575 1726867687.71698: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30575 1726867687.71707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867687.71710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867687.71713: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867687.71716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.71755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867687.71758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867687.71808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867687.73658: stdout chunk (state=3): >>>ansible-tmp-1726867687.7126307-35791-192058019590793=/root/.ansible/tmp/ansible-tmp-1726867687.7126307-35791-192058019590793 <<< 30575 1726867687.73769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867687.73794: stderr chunk (state=3): >>><<< 30575 1726867687.73797: stdout chunk (state=3): >>><<< 30575 1726867687.73813: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867687.7126307-35791-192058019590793=/root/.ansible/tmp/ansible-tmp-1726867687.7126307-35791-192058019590793 , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867687.73836: variable 'ansible_module_compression' from source: unknown 30575 1726867687.73872: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867687.73901: variable 'ansible_facts' from source: unknown 30575 1726867687.73957: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867687.7126307-35791-192058019590793/AnsiballZ_command.py 30575 1726867687.74050: Sending initial data 30575 1726867687.74053: Sent initial data (156 bytes) 30575 1726867687.74474: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867687.74479: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867687.74482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867687.74484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867687.74486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.74534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867687.74540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867687.74583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867687.76083: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867687.76086: stderr chunk 
(state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867687.76121: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867687.76163: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp6rwo19ay /root/.ansible/tmp/ansible-tmp-1726867687.7126307-35791-192058019590793/AnsiballZ_command.py <<< 30575 1726867687.76166: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867687.7126307-35791-192058019590793/AnsiballZ_command.py" <<< 30575 1726867687.76210: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp6rwo19ay" to remote "/root/.ansible/tmp/ansible-tmp-1726867687.7126307-35791-192058019590793/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867687.7126307-35791-192058019590793/AnsiballZ_command.py" <<< 30575 1726867687.76731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867687.76765: stderr chunk (state=3): >>><<< 30575 1726867687.76768: stdout chunk (state=3): >>><<< 30575 1726867687.76785: done transferring module to remote 30575 1726867687.76792: _low_level_execute_command(): starting 30575 1726867687.76795: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867687.7126307-35791-192058019590793/ /root/.ansible/tmp/ansible-tmp-1726867687.7126307-35791-192058019590793/AnsiballZ_command.py && sleep 0' 30575 1726867687.77200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867687.77204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867687.77206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867687.77211: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867687.77214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.77261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867687.77265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867687.77313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867687.79028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867687.79049: stderr chunk (state=3): >>><<< 30575 1726867687.79052: stdout chunk (state=3): >>><<< 30575 1726867687.79063: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867687.79066: _low_level_execute_command(): starting 30575 1726867687.79069: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867687.7126307-35791-192058019590793/AnsiballZ_command.py && sleep 0' 30575 1726867687.79458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867687.79461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.79463: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867687.79465: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867687.79469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.79515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867687.79524: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867687.79568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867687.96151: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 17:28:07.943527", "end": "2024-09-20 17:28:07.959340", "delta": "0:00:00.015813", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867687.97609: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867687.97640: stderr chunk (state=3): >>><<< 30575 1726867687.97643: stdout chunk (state=3): >>><<< 30575 1726867687.97663: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 17:28:07.943527", "end": "2024-09-20 17:28:07.959340", "delta": "0:00:00.015813", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.68 
closed. 30575 1726867687.97697: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867687.7126307-35791-192058019590793/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867687.97704: _low_level_execute_command(): starting 30575 1726867687.97709: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867687.7126307-35791-192058019590793/ > /dev/null 2>&1 && sleep 0' 30575 1726867687.98172: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867687.98176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.98180: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867687.98182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30575 1726867687.98184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867687.98234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867687.98241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867687.98291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867688.00091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867688.00119: stderr chunk (state=3): >>><<< 30575 1726867688.00125: stdout chunk (state=3): >>><<< 30575 1726867688.00138: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867688.00144: handler run complete 30575 1726867688.00161: Evaluated conditional (False): False 30575 1726867688.00170: attempt loop complete, returning result 30575 1726867688.00172: _execute() done 30575 1726867688.00175: dumping result to json 30575 1726867688.00181: done dumping result, returning 30575 1726867688.00189: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcac9-a3a5-e081-a588-00000000288b] 30575 1726867688.00193: sending task result for task 0affcac9-a3a5-e081-a588-00000000288b 30575 1726867688.00292: done sending task result for task 0affcac9-a3a5-e081-a588-00000000288b 30575 1726867688.00295: WORKER PROCESS EXITING fatal: [managed_node3]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.015813", "end": "2024-09-20 17:28:07.959340", "rc": 1, "start": "2024-09-20 17:28:07.943527" } MSG: non-zero return code ...ignoring 30575 1726867688.00391: no more pending results, returning what we have 30575 1726867688.00395: results queue empty 30575 1726867688.00396: checking for any_errors_fatal 30575 1726867688.00403: done checking for any_errors_fatal 30575 1726867688.00404: checking for max_fail_percentage 30575 1726867688.00406: done checking for max_fail_percentage 30575 1726867688.00407: checking to see if all hosts have failed and the running result is not ok 30575 1726867688.00407: done checking to see if all hosts have failed 30575 1726867688.00408: getting the remaining hosts for this loop 30575 1726867688.00410: done getting the remaining hosts for this loop 30575 1726867688.00414: getting the next task for host managed_node3 30575 1726867688.00424: done getting next task for host managed_node3 30575 1726867688.00426: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag 
true based on the nmcli output 30575 1726867688.00430: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867688.00434: getting variables 30575 1726867688.00435: in VariableManager get_vars() 30575 1726867688.00484: Calling all_inventory to load vars for managed_node3 30575 1726867688.00487: Calling groups_inventory to load vars for managed_node3 30575 1726867688.00491: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867688.00501: Calling all_plugins_play to load vars for managed_node3 30575 1726867688.00504: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867688.00506: Calling groups_plugins_play to load vars for managed_node3 30575 1726867688.01491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867688.02332: done with get_vars() 30575 1726867688.02348: done getting variables 30575 1726867688.02393: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:28:08 -0400 (0:00:00.350) 0:02:03.401 ****** 30575 1726867688.02416: entering _queue_task() for managed_node3/set_fact 30575 1726867688.02655: worker is 1 (out of 1 available) 30575 1726867688.02668: exiting _queue_task() for managed_node3/set_fact 30575 1726867688.02684: done queuing things up, now waiting for results queue to drain 30575 1726867688.02686: waiting for pending results... 
30575 1726867688.02881: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30575 1726867688.02978: in run() - task 0affcac9-a3a5-e081-a588-00000000288c 30575 1726867688.02991: variable 'ansible_search_path' from source: unknown 30575 1726867688.02995: variable 'ansible_search_path' from source: unknown 30575 1726867688.03028: calling self._execute() 30575 1726867688.03100: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.03104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.03112: variable 'omit' from source: magic vars 30575 1726867688.03401: variable 'ansible_distribution_major_version' from source: facts 30575 1726867688.03410: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867688.03508: variable 'nm_profile_exists' from source: set_fact 30575 1726867688.03517: Evaluated conditional (nm_profile_exists.rc == 0): False 30575 1726867688.03520: when evaluation is False, skipping this task 30575 1726867688.03525: _execute() done 30575 1726867688.03529: dumping result to json 30575 1726867688.03531: done dumping result, returning 30575 1726867688.03539: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-e081-a588-00000000288c] 30575 1726867688.03544: sending task result for task 0affcac9-a3a5-e081-a588-00000000288c 30575 1726867688.03633: done sending task result for task 0affcac9-a3a5-e081-a588-00000000288c 30575 1726867688.03636: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 30575 1726867688.03715: no more pending results, returning what we have 30575 1726867688.03722: results queue empty 30575 1726867688.03722: checking for any_errors_fatal 30575 
1726867688.03731: done checking for any_errors_fatal 30575 1726867688.03731: checking for max_fail_percentage 30575 1726867688.03733: done checking for max_fail_percentage 30575 1726867688.03733: checking to see if all hosts have failed and the running result is not ok 30575 1726867688.03734: done checking to see if all hosts have failed 30575 1726867688.03735: getting the remaining hosts for this loop 30575 1726867688.03736: done getting the remaining hosts for this loop 30575 1726867688.03739: getting the next task for host managed_node3 30575 1726867688.03750: done getting next task for host managed_node3 30575 1726867688.03752: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30575 1726867688.03757: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867688.03760: getting variables 30575 1726867688.03762: in VariableManager get_vars() 30575 1726867688.03801: Calling all_inventory to load vars for managed_node3 30575 1726867688.03804: Calling groups_inventory to load vars for managed_node3 30575 1726867688.03807: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867688.03816: Calling all_plugins_play to load vars for managed_node3 30575 1726867688.03821: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867688.03824: Calling groups_plugins_play to load vars for managed_node3 30575 1726867688.04583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867688.05452: done with get_vars() 30575 1726867688.05467: done getting variables 30575 1726867688.05510: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867688.05596: variable 'profile' from source: play vars 30575 1726867688.05599: variable 'interface' from source: play vars 30575 1726867688.05642: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:28:08 -0400 (0:00:00.032) 0:02:03.434 ****** 30575 1726867688.05666: entering _queue_task() for managed_node3/command 30575 1726867688.05915: worker is 1 (out of 1 available) 30575 1726867688.05932: exiting _queue_task() for managed_node3/command 30575 1726867688.05946: done queuing things up, now waiting for results queue to drain 30575 1726867688.05948: waiting for pending results... 
30575 1726867688.06141: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr 30575 1726867688.06228: in run() - task 0affcac9-a3a5-e081-a588-00000000288e 30575 1726867688.06241: variable 'ansible_search_path' from source: unknown 30575 1726867688.06245: variable 'ansible_search_path' from source: unknown 30575 1726867688.06273: calling self._execute() 30575 1726867688.06348: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.06351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.06360: variable 'omit' from source: magic vars 30575 1726867688.06636: variable 'ansible_distribution_major_version' from source: facts 30575 1726867688.06646: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867688.06735: variable 'profile_stat' from source: set_fact 30575 1726867688.06743: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867688.06746: when evaluation is False, skipping this task 30575 1726867688.06749: _execute() done 30575 1726867688.06752: dumping result to json 30575 1726867688.06754: done dumping result, returning 30575 1726867688.06762: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-00000000288e] 30575 1726867688.06767: sending task result for task 0affcac9-a3a5-e081-a588-00000000288e 30575 1726867688.06852: done sending task result for task 0affcac9-a3a5-e081-a588-00000000288e 30575 1726867688.06854: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867688.06908: no more pending results, returning what we have 30575 1726867688.06912: results queue empty 30575 1726867688.06913: checking for any_errors_fatal 30575 1726867688.06925: done checking for any_errors_fatal 30575 1726867688.06925: 
checking for max_fail_percentage 30575 1726867688.06927: done checking for max_fail_percentage 30575 1726867688.06928: checking to see if all hosts have failed and the running result is not ok 30575 1726867688.06929: done checking to see if all hosts have failed 30575 1726867688.06929: getting the remaining hosts for this loop 30575 1726867688.06931: done getting the remaining hosts for this loop 30575 1726867688.06934: getting the next task for host managed_node3 30575 1726867688.06944: done getting next task for host managed_node3 30575 1726867688.06947: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30575 1726867688.06951: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867688.06955: getting variables 30575 1726867688.06957: in VariableManager get_vars() 30575 1726867688.06998: Calling all_inventory to load vars for managed_node3 30575 1726867688.07001: Calling groups_inventory to load vars for managed_node3 30575 1726867688.07004: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867688.07015: Calling all_plugins_play to load vars for managed_node3 30575 1726867688.07020: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867688.07023: Calling groups_plugins_play to load vars for managed_node3 30575 1726867688.07952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867688.08802: done with get_vars() 30575 1726867688.08821: done getting variables 30575 1726867688.08862: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867688.08938: variable 'profile' from source: play vars 30575 1726867688.08941: variable 'interface' from source: play vars 30575 1726867688.08980: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:28:08 -0400 (0:00:00.033) 0:02:03.467 ****** 30575 1726867688.09003: entering _queue_task() for managed_node3/set_fact 30575 1726867688.09240: worker is 1 (out of 1 available) 30575 1726867688.09254: exiting _queue_task() for managed_node3/set_fact 30575 1726867688.09268: done queuing things up, now waiting for results queue to drain 30575 1726867688.09270: waiting for pending results... 
30575 1726867688.09455: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr 30575 1726867688.09536: in run() - task 0affcac9-a3a5-e081-a588-00000000288f 30575 1726867688.09550: variable 'ansible_search_path' from source: unknown 30575 1726867688.09554: variable 'ansible_search_path' from source: unknown 30575 1726867688.09584: calling self._execute() 30575 1726867688.09665: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.09669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.09679: variable 'omit' from source: magic vars 30575 1726867688.09956: variable 'ansible_distribution_major_version' from source: facts 30575 1726867688.09966: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867688.10056: variable 'profile_stat' from source: set_fact 30575 1726867688.10066: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867688.10069: when evaluation is False, skipping this task 30575 1726867688.10072: _execute() done 30575 1726867688.10074: dumping result to json 30575 1726867688.10080: done dumping result, returning 30575 1726867688.10087: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-00000000288f] 30575 1726867688.10092: sending task result for task 0affcac9-a3a5-e081-a588-00000000288f 30575 1726867688.10184: done sending task result for task 0affcac9-a3a5-e081-a588-00000000288f 30575 1726867688.10187: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867688.10233: no more pending results, returning what we have 30575 1726867688.10237: results queue empty 30575 1726867688.10238: checking for any_errors_fatal 30575 1726867688.10246: done checking for any_errors_fatal 30575 1726867688.10247: 
checking for max_fail_percentage 30575 1726867688.10248: done checking for max_fail_percentage 30575 1726867688.10249: checking to see if all hosts have failed and the running result is not ok 30575 1726867688.10250: done checking to see if all hosts have failed 30575 1726867688.10251: getting the remaining hosts for this loop 30575 1726867688.10252: done getting the remaining hosts for this loop 30575 1726867688.10256: getting the next task for host managed_node3 30575 1726867688.10263: done getting next task for host managed_node3 30575 1726867688.10266: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30575 1726867688.10270: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867688.10274: getting variables 30575 1726867688.10276: in VariableManager get_vars() 30575 1726867688.10321: Calling all_inventory to load vars for managed_node3 30575 1726867688.10323: Calling groups_inventory to load vars for managed_node3 30575 1726867688.10326: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867688.10338: Calling all_plugins_play to load vars for managed_node3 30575 1726867688.10340: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867688.10342: Calling groups_plugins_play to load vars for managed_node3 30575 1726867688.11142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867688.12007: done with get_vars() 30575 1726867688.12024: done getting variables 30575 1726867688.12065: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867688.12140: variable 'profile' from source: play vars 30575 1726867688.12144: variable 'interface' from source: play vars 30575 1726867688.12184: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:28:08 -0400 (0:00:00.032) 0:02:03.499 ****** 30575 1726867688.12207: entering _queue_task() for managed_node3/command 30575 1726867688.12431: worker is 1 (out of 1 available) 30575 1726867688.12444: exiting _queue_task() for managed_node3/command 30575 1726867688.12457: done queuing things up, now waiting for results queue to drain 30575 1726867688.12459: waiting for pending results... 
30575 1726867688.12640: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr 30575 1726867688.12722: in run() - task 0affcac9-a3a5-e081-a588-000000002890 30575 1726867688.12732: variable 'ansible_search_path' from source: unknown 30575 1726867688.12735: variable 'ansible_search_path' from source: unknown 30575 1726867688.12762: calling self._execute() 30575 1726867688.12835: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.12838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.12848: variable 'omit' from source: magic vars 30575 1726867688.13116: variable 'ansible_distribution_major_version' from source: facts 30575 1726867688.13127: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867688.13210: variable 'profile_stat' from source: set_fact 30575 1726867688.13221: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867688.13225: when evaluation is False, skipping this task 30575 1726867688.13229: _execute() done 30575 1726867688.13232: dumping result to json 30575 1726867688.13235: done dumping result, returning 30575 1726867688.13237: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-000000002890] 30575 1726867688.13246: sending task result for task 0affcac9-a3a5-e081-a588-000000002890 30575 1726867688.13326: done sending task result for task 0affcac9-a3a5-e081-a588-000000002890 30575 1726867688.13329: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867688.13394: no more pending results, returning what we have 30575 1726867688.13398: results queue empty 30575 1726867688.13399: checking for any_errors_fatal 30575 1726867688.13403: done checking for any_errors_fatal 30575 1726867688.13404: checking for 
max_fail_percentage 30575 1726867688.13405: done checking for max_fail_percentage 30575 1726867688.13406: checking to see if all hosts have failed and the running result is not ok 30575 1726867688.13407: done checking to see if all hosts have failed 30575 1726867688.13408: getting the remaining hosts for this loop 30575 1726867688.13409: done getting the remaining hosts for this loop 30575 1726867688.13412: getting the next task for host managed_node3 30575 1726867688.13422: done getting next task for host managed_node3 30575 1726867688.13425: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30575 1726867688.13429: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867688.13432: getting variables 30575 1726867688.13433: in VariableManager get_vars() 30575 1726867688.13467: Calling all_inventory to load vars for managed_node3 30575 1726867688.13469: Calling groups_inventory to load vars for managed_node3 30575 1726867688.13472: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867688.13482: Calling all_plugins_play to load vars for managed_node3 30575 1726867688.13485: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867688.13488: Calling groups_plugins_play to load vars for managed_node3 30575 1726867688.14349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867688.15200: done with get_vars() 30575 1726867688.15216: done getting variables 30575 1726867688.15259: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867688.15334: variable 'profile' from source: play vars 30575 1726867688.15337: variable 'interface' from source: play vars 30575 1726867688.15378: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:28:08 -0400 (0:00:00.031) 0:02:03.531 ****** 30575 1726867688.15401: entering _queue_task() for managed_node3/set_fact 30575 1726867688.15627: worker is 1 (out of 1 available) 30575 1726867688.15640: exiting _queue_task() for managed_node3/set_fact 30575 1726867688.15653: done queuing things up, now waiting for results queue to drain 30575 1726867688.15655: waiting for pending results... 
30575 1726867688.15845: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr 30575 1726867688.15946: in run() - task 0affcac9-a3a5-e081-a588-000000002891 30575 1726867688.15957: variable 'ansible_search_path' from source: unknown 30575 1726867688.15960: variable 'ansible_search_path' from source: unknown 30575 1726867688.15994: calling self._execute() 30575 1726867688.16070: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.16074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.16085: variable 'omit' from source: magic vars 30575 1726867688.16357: variable 'ansible_distribution_major_version' from source: facts 30575 1726867688.16366: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867688.16457: variable 'profile_stat' from source: set_fact 30575 1726867688.16466: Evaluated conditional (profile_stat.stat.exists): False 30575 1726867688.16469: when evaluation is False, skipping this task 30575 1726867688.16472: _execute() done 30575 1726867688.16474: dumping result to json 30575 1726867688.16479: done dumping result, returning 30575 1726867688.16486: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcac9-a3a5-e081-a588-000000002891] 30575 1726867688.16491: sending task result for task 0affcac9-a3a5-e081-a588-000000002891 30575 1726867688.16573: done sending task result for task 0affcac9-a3a5-e081-a588-000000002891 30575 1726867688.16576: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30575 1726867688.16622: no more pending results, returning what we have 30575 1726867688.16627: results queue empty 30575 1726867688.16628: checking for any_errors_fatal 30575 1726867688.16636: done checking for any_errors_fatal 30575 1726867688.16636: checking 
for max_fail_percentage 30575 1726867688.16638: done checking for max_fail_percentage 30575 1726867688.16639: checking to see if all hosts have failed and the running result is not ok 30575 1726867688.16640: done checking to see if all hosts have failed 30575 1726867688.16641: getting the remaining hosts for this loop 30575 1726867688.16642: done getting the remaining hosts for this loop 30575 1726867688.16645: getting the next task for host managed_node3 30575 1726867688.16655: done getting next task for host managed_node3 30575 1726867688.16658: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 30575 1726867688.16661: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867688.16665: getting variables 30575 1726867688.16667: in VariableManager get_vars() 30575 1726867688.16707: Calling all_inventory to load vars for managed_node3 30575 1726867688.16710: Calling groups_inventory to load vars for managed_node3 30575 1726867688.16713: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867688.16724: Calling all_plugins_play to load vars for managed_node3 30575 1726867688.16726: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867688.16729: Calling groups_plugins_play to load vars for managed_node3 30575 1726867688.17488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867688.18451: done with get_vars() 30575 1726867688.18467: done getting variables 30575 1726867688.18508: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867688.18582: variable 'profile' from source: play vars 30575 1726867688.18585: variable 'interface' from source: play vars 30575 1726867688.18623: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 17:28:08 -0400 (0:00:00.032) 0:02:03.564 ****** 30575 1726867688.18645: entering _queue_task() for managed_node3/assert 30575 1726867688.18857: worker is 1 (out of 1 available) 30575 1726867688.18872: exiting _queue_task() for managed_node3/assert 30575 1726867688.18886: done queuing things up, now waiting for results queue to drain 30575 1726867688.18888: waiting for pending results... 
30575 1726867688.19073: running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'statebr' 30575 1726867688.19149: in run() - task 0affcac9-a3a5-e081-a588-000000002805 30575 1726867688.19161: variable 'ansible_search_path' from source: unknown 30575 1726867688.19164: variable 'ansible_search_path' from source: unknown 30575 1726867688.19196: calling self._execute() 30575 1726867688.19273: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.19276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.19286: variable 'omit' from source: magic vars 30575 1726867688.19551: variable 'ansible_distribution_major_version' from source: facts 30575 1726867688.19564: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867688.19570: variable 'omit' from source: magic vars 30575 1726867688.19604: variable 'omit' from source: magic vars 30575 1726867688.19674: variable 'profile' from source: play vars 30575 1726867688.19680: variable 'interface' from source: play vars 30575 1726867688.19727: variable 'interface' from source: play vars 30575 1726867688.19742: variable 'omit' from source: magic vars 30575 1726867688.19775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867688.19804: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867688.19822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867688.19833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867688.19844: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867688.19868: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30575 1726867688.19873: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.19876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.19945: Set connection var ansible_pipelining to False 30575 1726867688.19948: Set connection var ansible_shell_type to sh 30575 1726867688.19954: Set connection var ansible_shell_executable to /bin/sh 30575 1726867688.19959: Set connection var ansible_timeout to 10 30575 1726867688.19964: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867688.19970: Set connection var ansible_connection to ssh 30575 1726867688.19994: variable 'ansible_shell_executable' from source: unknown 30575 1726867688.19997: variable 'ansible_connection' from source: unknown 30575 1726867688.19999: variable 'ansible_module_compression' from source: unknown 30575 1726867688.20001: variable 'ansible_shell_type' from source: unknown 30575 1726867688.20004: variable 'ansible_shell_executable' from source: unknown 30575 1726867688.20006: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.20008: variable 'ansible_pipelining' from source: unknown 30575 1726867688.20010: variable 'ansible_timeout' from source: unknown 30575 1726867688.20012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.20109: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867688.20122: variable 'omit' from source: magic vars 30575 1726867688.20125: starting attempt loop 30575 1726867688.20128: running the handler 30575 1726867688.20213: variable 'lsr_net_profile_exists' from source: set_fact 30575 1726867688.20216: Evaluated conditional (not 
lsr_net_profile_exists): True 30575 1726867688.20222: handler run complete 30575 1726867688.20230: attempt loop complete, returning result 30575 1726867688.20233: _execute() done 30575 1726867688.20235: dumping result to json 30575 1726867688.20238: done dumping result, returning 30575 1726867688.20244: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'statebr' [0affcac9-a3a5-e081-a588-000000002805] 30575 1726867688.20250: sending task result for task 0affcac9-a3a5-e081-a588-000000002805 30575 1726867688.20331: done sending task result for task 0affcac9-a3a5-e081-a588-000000002805 30575 1726867688.20334: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867688.20383: no more pending results, returning what we have 30575 1726867688.20387: results queue empty 30575 1726867688.20387: checking for any_errors_fatal 30575 1726867688.20394: done checking for any_errors_fatal 30575 1726867688.20395: checking for max_fail_percentage 30575 1726867688.20396: done checking for max_fail_percentage 30575 1726867688.20397: checking to see if all hosts have failed and the running result is not ok 30575 1726867688.20398: done checking to see if all hosts have failed 30575 1726867688.20398: getting the remaining hosts for this loop 30575 1726867688.20400: done getting the remaining hosts for this loop 30575 1726867688.20403: getting the next task for host managed_node3 30575 1726867688.20413: done getting next task for host managed_node3 30575 1726867688.20416: ^ task is: TASK: Get NetworkManager RPM version 30575 1726867688.20421: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867688.20425: getting variables 30575 1726867688.20426: in VariableManager get_vars() 30575 1726867688.20466: Calling all_inventory to load vars for managed_node3 30575 1726867688.20468: Calling groups_inventory to load vars for managed_node3 30575 1726867688.20472: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867688.20483: Calling all_plugins_play to load vars for managed_node3 30575 1726867688.20485: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867688.20488: Calling groups_plugins_play to load vars for managed_node3 30575 1726867688.21247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867688.22120: done with get_vars() 30575 1726867688.22134: done getting variables 30575 1726867688.22175: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NetworkManager RPM version] ****************************************** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:7 Friday 20 September 2024 17:28:08 -0400 (0:00:00.035) 0:02:03.599 ****** 30575 1726867688.22202: entering _queue_task() for managed_node3/command 30575 1726867688.22405: worker is 1 (out of 1 available) 30575 1726867688.22420: exiting _queue_task() for managed_node3/command 30575 1726867688.22434: done queuing things up, now waiting for results queue to drain 30575 1726867688.22435: waiting for pending results... 30575 1726867688.22609: running TaskExecutor() for managed_node3/TASK: Get NetworkManager RPM version 30575 1726867688.22678: in run() - task 0affcac9-a3a5-e081-a588-000000002809 30575 1726867688.22693: variable 'ansible_search_path' from source: unknown 30575 1726867688.22696: variable 'ansible_search_path' from source: unknown 30575 1726867688.22723: calling self._execute() 30575 1726867688.22800: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.22803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.22811: variable 'omit' from source: magic vars 30575 1726867688.23072: variable 'ansible_distribution_major_version' from source: facts 30575 1726867688.23082: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867688.23089: variable 'omit' from source: magic vars 30575 1726867688.23181: variable 'omit' from source: magic vars 30575 1726867688.23184: variable 'omit' from source: magic vars 30575 1726867688.23185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867688.23207: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867688.23223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867688.23235: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867688.23245: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867688.23271: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867688.23274: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.23276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.23349: Set connection var ansible_pipelining to False 30575 1726867688.23352: Set connection var ansible_shell_type to sh 30575 1726867688.23357: Set connection var ansible_shell_executable to /bin/sh 30575 1726867688.23362: Set connection var ansible_timeout to 10 30575 1726867688.23367: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867688.23373: Set connection var ansible_connection to ssh 30575 1726867688.23393: variable 'ansible_shell_executable' from source: unknown 30575 1726867688.23396: variable 'ansible_connection' from source: unknown 30575 1726867688.23399: variable 'ansible_module_compression' from source: unknown 30575 1726867688.23401: variable 'ansible_shell_type' from source: unknown 30575 1726867688.23403: variable 'ansible_shell_executable' from source: unknown 30575 1726867688.23405: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.23407: variable 'ansible_pipelining' from source: unknown 30575 1726867688.23410: variable 'ansible_timeout' from source: unknown 30575 1726867688.23416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.23513: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867688.23537: variable 'omit' from source: magic vars 30575 1726867688.23540: starting attempt loop 30575 1726867688.23542: running the handler 30575 1726867688.23545: _low_level_execute_command(): starting 30575 1726867688.23548: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867688.24046: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867688.24082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867688.24086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867688.24089: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867688.24091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867688.24132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867688.24145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867688.24210: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867688.25896: stdout chunk (state=3): >>>/root <<< 30575 1726867688.25995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867688.26027: stderr chunk (state=3): >>><<< 30575 1726867688.26030: stdout chunk (state=3): >>><<< 30575 1726867688.26050: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867688.26060: _low_level_execute_command(): starting 30575 1726867688.26067: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867688.2604947-35805-95456947720822 `" && echo ansible-tmp-1726867688.2604947-35805-95456947720822="` echo 
/root/.ansible/tmp/ansible-tmp-1726867688.2604947-35805-95456947720822 `" ) && sleep 0' 30575 1726867688.26506: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867688.26509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867688.26512: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867688.26524: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867688.26526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867688.26568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867688.26576: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867688.26622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867688.28480: stdout chunk (state=3): >>>ansible-tmp-1726867688.2604947-35805-95456947720822=/root/.ansible/tmp/ansible-tmp-1726867688.2604947-35805-95456947720822 <<< 30575 1726867688.28588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867688.28612: stderr chunk (state=3): >>><<< 30575 
1726867688.28615: stdout chunk (state=3): >>><<< 30575 1726867688.28630: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867688.2604947-35805-95456947720822=/root/.ansible/tmp/ansible-tmp-1726867688.2604947-35805-95456947720822 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867688.28656: variable 'ansible_module_compression' from source: unknown 30575 1726867688.28700: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867688.28731: variable 'ansible_facts' from source: unknown 30575 1726867688.28787: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867688.2604947-35805-95456947720822/AnsiballZ_command.py 30575 1726867688.28888: Sending initial data 30575 1726867688.28891: Sent initial data (155 bytes) 30575 1726867688.29331: 
stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867688.29335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867688.29337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867688.29340: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867688.29341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867688.29380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867688.29395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867688.29443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867688.30987: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867688.30990: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867688.31025: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867688.31073: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmpkw7zyqpf /root/.ansible/tmp/ansible-tmp-1726867688.2604947-35805-95456947720822/AnsiballZ_command.py <<< 30575 1726867688.31080: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867688.2604947-35805-95456947720822/AnsiballZ_command.py" <<< 30575 1726867688.31118: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmpkw7zyqpf" to remote "/root/.ansible/tmp/ansible-tmp-1726867688.2604947-35805-95456947720822/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867688.2604947-35805-95456947720822/AnsiballZ_command.py" <<< 30575 1726867688.31646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867688.31684: stderr chunk (state=3): >>><<< 30575 1726867688.31687: stdout chunk (state=3): >>><<< 30575 1726867688.31730: done transferring module to remote 30575 1726867688.31738: _low_level_execute_command(): starting 30575 1726867688.31743: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867688.2604947-35805-95456947720822/ /root/.ansible/tmp/ansible-tmp-1726867688.2604947-35805-95456947720822/AnsiballZ_command.py && sleep 0' 30575 1726867688.32163: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867688.32166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867688.32169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867688.32171: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867688.32178: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867688.32223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867688.32226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867688.32271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867688.33986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867688.34012: stderr chunk (state=3): >>><<< 30575 1726867688.34015: stdout chunk (state=3): >>><<< 30575 1726867688.34031: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867688.34034: _low_level_execute_command(): starting 30575 1726867688.34037: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867688.2604947-35805-95456947720822/AnsiballZ_command.py && sleep 0' 30575 1726867688.34441: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867688.34444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867688.34446: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 30575 1726867688.34448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867688.34496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867688.34499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867688.34552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867688.65931: stdout chunk (state=3): >>> {"changed": true, "stdout": "NetworkManager-1.48.10-1.el10", "stderr": "", "rc": 0, "cmd": ["rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager"], "start": "2024-09-20 17:28:08.494592", "end": "2024-09-20 17:28:08.657044", "delta": "0:00:00.162452", "msg": "", "invocation": {"module_args": {"_raw_params": "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867688.67601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867688.67608: stderr chunk (state=3): >>><<< 30575 1726867688.67611: stdout chunk (state=3): >>><<< 30575 1726867688.67629: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "NetworkManager-1.48.10-1.el10", "stderr": "", "rc": 0, "cmd": ["rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager"], "start": "2024-09-20 17:28:08.494592", "end": "2024-09-20 17:28:08.657044", "delta": "0:00:00.162452", "msg": "", "invocation": {"module_args": {"_raw_params": "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 
closed. 30575 1726867688.67657: done with _execute_module (ansible.legacy.command, {'_raw_params': "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867688.2604947-35805-95456947720822/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867688.67664: _low_level_execute_command(): starting 30575 1726867688.67669: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867688.2604947-35805-95456947720822/ > /dev/null 2>&1 && sleep 0' 30575 1726867688.68112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867688.68117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867688.68125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867688.68127: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867688.68130: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867688.68181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867688.68188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867688.68189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867688.68230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867688.70053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867688.70076: stderr chunk (state=3): >>><<< 30575 1726867688.70081: stdout chunk (state=3): >>><<< 30575 1726867688.70094: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867688.70100: handler run complete 30575 1726867688.70119: Evaluated conditional (False): False 30575 1726867688.70126: attempt loop complete, returning result 30575 1726867688.70129: _execute() done 30575 1726867688.70131: dumping result to json 30575 1726867688.70136: done dumping result, returning 30575 1726867688.70143: done running TaskExecutor() for managed_node3/TASK: Get NetworkManager RPM version [0affcac9-a3a5-e081-a588-000000002809] 30575 1726867688.70148: sending task result for task 0affcac9-a3a5-e081-a588-000000002809 30575 1726867688.70246: done sending task result for task 0affcac9-a3a5-e081-a588-000000002809 30575 1726867688.70249: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager" ], "delta": "0:00:00.162452", "end": "2024-09-20 17:28:08.657044", "rc": 0, "start": "2024-09-20 17:28:08.494592" } STDOUT: NetworkManager-1.48.10-1.el10 30575 1726867688.70320: no more pending results, returning what we have 30575 1726867688.70324: results queue empty 30575 1726867688.70325: checking for any_errors_fatal 30575 1726867688.70331: done checking for any_errors_fatal 30575 1726867688.70331: checking for max_fail_percentage 30575 1726867688.70333: done checking for max_fail_percentage 30575 1726867688.70334: checking to see if all hosts have failed and the running result is not ok 30575 1726867688.70335: done checking to see if all hosts have failed 30575 1726867688.70336: getting the remaining hosts for this loop 30575 1726867688.70337: done getting the remaining hosts for this loop 30575 1726867688.70341: getting the next task for host managed_node3 30575 1726867688.70349: done getting next task for host managed_node3 30575 1726867688.70351: ^ task is: TASK: Store 
NetworkManager version 30575 1726867688.70354: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867688.70358: getting variables 30575 1726867688.70359: in VariableManager get_vars() 30575 1726867688.70414: Calling all_inventory to load vars for managed_node3 30575 1726867688.70417: Calling groups_inventory to load vars for managed_node3 30575 1726867688.70422: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867688.70432: Calling all_plugins_play to load vars for managed_node3 30575 1726867688.70434: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867688.70437: Calling groups_plugins_play to load vars for managed_node3 30575 1726867688.71427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867688.72266: done with get_vars() 30575 1726867688.72284: done getting variables 30575 1726867688.72329: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Store NetworkManager version] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:14 Friday 20 September 2024 17:28:08 -0400 (0:00:00.501) 0:02:04.101 ****** 30575 1726867688.72352: entering _queue_task() for managed_node3/set_fact 30575 1726867688.72574: worker is 1 (out of 1 available) 30575 1726867688.72589: exiting _queue_task() for managed_node3/set_fact 30575 1726867688.72601: done queuing things up, now waiting for results queue to drain 30575 1726867688.72602: waiting for pending results... 30575 1726867688.72788: running TaskExecutor() for managed_node3/TASK: Store NetworkManager version 30575 1726867688.72880: in run() - task 0affcac9-a3a5-e081-a588-00000000280a 30575 1726867688.72892: variable 'ansible_search_path' from source: unknown 30575 1726867688.72895: variable 'ansible_search_path' from source: unknown 30575 1726867688.72925: calling self._execute() 30575 1726867688.73003: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.73006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.73015: variable 'omit' from source: magic vars 30575 1726867688.73287: variable 'ansible_distribution_major_version' from source: facts 30575 1726867688.73296: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867688.73303: variable 'omit' from source: magic vars 30575 1726867688.73335: variable 'omit' from source: magic vars 30575 1726867688.73414: variable '__rpm_q_networkmanager' from source: set_fact 30575 1726867688.73433: variable 'omit' from source: magic vars 30575 1726867688.73465: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867688.73495: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867688.73513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867688.73527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867688.73537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867688.73562: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867688.73565: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.73568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.73641: Set connection var ansible_pipelining to False 30575 1726867688.73644: Set connection var ansible_shell_type to sh 30575 1726867688.73648: Set connection var ansible_shell_executable to /bin/sh 30575 1726867688.73654: Set connection var ansible_timeout to 10 30575 1726867688.73659: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867688.73665: Set connection var ansible_connection to ssh 30575 1726867688.73685: variable 'ansible_shell_executable' from source: unknown 30575 1726867688.73689: variable 'ansible_connection' from source: unknown 30575 1726867688.73691: variable 'ansible_module_compression' from source: unknown 30575 1726867688.73695: variable 'ansible_shell_type' from source: unknown 30575 1726867688.73697: variable 'ansible_shell_executable' from source: unknown 30575 1726867688.73699: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.73701: variable 'ansible_pipelining' from source: unknown 30575 1726867688.73703: variable 'ansible_timeout' from 
source: unknown 30575 1726867688.73705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.73803: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867688.73813: variable 'omit' from source: magic vars 30575 1726867688.73820: starting attempt loop 30575 1726867688.73823: running the handler 30575 1726867688.73834: handler run complete 30575 1726867688.73843: attempt loop complete, returning result 30575 1726867688.73845: _execute() done 30575 1726867688.73848: dumping result to json 30575 1726867688.73850: done dumping result, returning 30575 1726867688.73857: done running TaskExecutor() for managed_node3/TASK: Store NetworkManager version [0affcac9-a3a5-e081-a588-00000000280a] 30575 1726867688.73861: sending task result for task 0affcac9-a3a5-e081-a588-00000000280a 30575 1726867688.73943: done sending task result for task 0affcac9-a3a5-e081-a588-00000000280a 30575 1726867688.73946: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "networkmanager_nvr": "NetworkManager-1.48.10-1.el10" }, "changed": false } 30575 1726867688.74003: no more pending results, returning what we have 30575 1726867688.74006: results queue empty 30575 1726867688.74007: checking for any_errors_fatal 30575 1726867688.74015: done checking for any_errors_fatal 30575 1726867688.74016: checking for max_fail_percentage 30575 1726867688.74020: done checking for max_fail_percentage 30575 1726867688.74021: checking to see if all hosts have failed and the running result is not ok 30575 1726867688.74022: done checking to see if all hosts have failed 30575 1726867688.74022: getting the remaining hosts for this loop 30575 1726867688.74024: done getting the remaining hosts for this 
loop 30575 1726867688.74027: getting the next task for host managed_node3 30575 1726867688.74034: done getting next task for host managed_node3 30575 1726867688.74036: ^ task is: TASK: Show NetworkManager version 30575 1726867688.74039: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867688.74042: getting variables 30575 1726867688.74043: in VariableManager get_vars() 30575 1726867688.74086: Calling all_inventory to load vars for managed_node3 30575 1726867688.74089: Calling groups_inventory to load vars for managed_node3 30575 1726867688.74092: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867688.74102: Calling all_plugins_play to load vars for managed_node3 30575 1726867688.74104: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867688.74107: Calling groups_plugins_play to load vars for managed_node3 30575 1726867688.74874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867688.75743: done with get_vars() 30575 1726867688.75758: done getting variables 30575 1726867688.75800: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show NetworkManager version] ********************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:18 Friday 20 September 2024 17:28:08 -0400 (0:00:00.034) 0:02:04.135 ****** 30575 1726867688.75825: entering _queue_task() for managed_node3/debug 30575 1726867688.76036: worker is 1 (out of 1 available) 30575 1726867688.76050: exiting _queue_task() for managed_node3/debug 30575 1726867688.76063: done queuing things up, now waiting for results queue to drain 30575 1726867688.76065: waiting for pending results... 
30575 1726867688.76238: running TaskExecutor() for managed_node3/TASK: Show NetworkManager version 30575 1726867688.76309: in run() - task 0affcac9-a3a5-e081-a588-00000000280b 30575 1726867688.76323: variable 'ansible_search_path' from source: unknown 30575 1726867688.76327: variable 'ansible_search_path' from source: unknown 30575 1726867688.76352: calling self._execute() 30575 1726867688.76427: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.76431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.76439: variable 'omit' from source: magic vars 30575 1726867688.76699: variable 'ansible_distribution_major_version' from source: facts 30575 1726867688.76708: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867688.76714: variable 'omit' from source: magic vars 30575 1726867688.76747: variable 'omit' from source: magic vars 30575 1726867688.76769: variable 'omit' from source: magic vars 30575 1726867688.76800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867688.76826: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867688.76845: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867688.76859: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867688.76868: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867688.76894: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867688.76897: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.76900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 
1726867688.76970: Set connection var ansible_pipelining to False 30575 1726867688.76973: Set connection var ansible_shell_type to sh 30575 1726867688.76979: Set connection var ansible_shell_executable to /bin/sh 30575 1726867688.76985: Set connection var ansible_timeout to 10 30575 1726867688.76990: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867688.76996: Set connection var ansible_connection to ssh 30575 1726867688.77014: variable 'ansible_shell_executable' from source: unknown 30575 1726867688.77017: variable 'ansible_connection' from source: unknown 30575 1726867688.77022: variable 'ansible_module_compression' from source: unknown 30575 1726867688.77025: variable 'ansible_shell_type' from source: unknown 30575 1726867688.77027: variable 'ansible_shell_executable' from source: unknown 30575 1726867688.77029: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.77031: variable 'ansible_pipelining' from source: unknown 30575 1726867688.77033: variable 'ansible_timeout' from source: unknown 30575 1726867688.77035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.77131: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867688.77140: variable 'omit' from source: magic vars 30575 1726867688.77145: starting attempt loop 30575 1726867688.77148: running the handler 30575 1726867688.77188: variable 'networkmanager_nvr' from source: set_fact 30575 1726867688.77241: variable 'networkmanager_nvr' from source: set_fact 30575 1726867688.77250: handler run complete 30575 1726867688.77266: attempt loop complete, returning result 30575 1726867688.77269: _execute() done 30575 1726867688.77272: dumping result to json 30575 
1726867688.77274: done dumping result, returning 30575 1726867688.77283: done running TaskExecutor() for managed_node3/TASK: Show NetworkManager version [0affcac9-a3a5-e081-a588-00000000280b] 30575 1726867688.77288: sending task result for task 0affcac9-a3a5-e081-a588-00000000280b 30575 1726867688.77367: done sending task result for task 0affcac9-a3a5-e081-a588-00000000280b 30575 1726867688.77370: WORKER PROCESS EXITING ok: [managed_node3] => { "networkmanager_nvr": "NetworkManager-1.48.10-1.el10" } 30575 1726867688.77430: no more pending results, returning what we have 30575 1726867688.77433: results queue empty 30575 1726867688.77434: checking for any_errors_fatal 30575 1726867688.77439: done checking for any_errors_fatal 30575 1726867688.77440: checking for max_fail_percentage 30575 1726867688.77441: done checking for max_fail_percentage 30575 1726867688.77442: checking to see if all hosts have failed and the running result is not ok 30575 1726867688.77443: done checking to see if all hosts have failed 30575 1726867688.77443: getting the remaining hosts for this loop 30575 1726867688.77445: done getting the remaining hosts for this loop 30575 1726867688.77448: getting the next task for host managed_node3 30575 1726867688.77456: done getting next task for host managed_node3 30575 1726867688.77460: ^ task is: TASK: Conditional asserts 30575 1726867688.77462: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867688.77466: getting variables 30575 1726867688.77467: in VariableManager get_vars() 30575 1726867688.77503: Calling all_inventory to load vars for managed_node3 30575 1726867688.77505: Calling groups_inventory to load vars for managed_node3 30575 1726867688.77508: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867688.77516: Calling all_plugins_play to load vars for managed_node3 30575 1726867688.77521: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867688.77524: Calling groups_plugins_play to load vars for managed_node3 30575 1726867688.78383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867688.79231: done with get_vars() 30575 1726867688.79245: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 17:28:08 -0400 (0:00:00.034) 0:02:04.170 ****** 30575 1726867688.79309: entering _queue_task() for managed_node3/include_tasks 30575 1726867688.79513: worker is 1 (out of 1 available) 30575 1726867688.79530: exiting _queue_task() for managed_node3/include_tasks 30575 1726867688.79542: done queuing things up, now waiting for results queue to drain 30575 1726867688.79544: waiting for pending results... 
30575 1726867688.79722: running TaskExecutor() for managed_node3/TASK: Conditional asserts 30575 1726867688.79797: in run() - task 0affcac9-a3a5-e081-a588-0000000020b3 30575 1726867688.79807: variable 'ansible_search_path' from source: unknown 30575 1726867688.79812: variable 'ansible_search_path' from source: unknown 30575 1726867688.80014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30575 1726867688.81465: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30575 1726867688.81513: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30575 1726867688.81541: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30575 1726867688.81565: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30575 1726867688.81586: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30575 1726867688.81656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30575 1726867688.81679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30575 1726867688.81697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30575 1726867688.81726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30575 1726867688.81737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30575 1726867688.81813: variable 'lsr_assert_when' from source: include params 30575 1726867688.81891: variable 'network_provider' from source: set_fact 30575 1726867688.81948: variable 'omit' from source: magic vars 30575 1726867688.82025: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.82032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.82045: variable 'omit' from source: magic vars 30575 1726867688.82179: variable 'ansible_distribution_major_version' from source: facts 30575 1726867688.82185: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867688.82263: variable 'item' from source: unknown 30575 1726867688.82266: Evaluated conditional (item['condition']): True 30575 1726867688.82323: variable 'item' from source: unknown 30575 1726867688.82342: variable 'item' from source: unknown 30575 1726867688.82394: variable 'item' from source: unknown 30575 1726867688.82532: dumping result to json 30575 1726867688.82536: done dumping result, returning 30575 1726867688.82538: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [0affcac9-a3a5-e081-a588-0000000020b3] 30575 1726867688.82540: sending task result for task 0affcac9-a3a5-e081-a588-0000000020b3 30575 1726867688.82581: done sending task result for task 0affcac9-a3a5-e081-a588-0000000020b3 30575 1726867688.82606: no more pending results, returning what we have 30575 1726867688.82611: in VariableManager get_vars() 30575 1726867688.82655: Calling all_inventory to load vars for managed_node3 30575 1726867688.82658: Calling groups_inventory to load vars for managed_node3 30575 1726867688.82661: Calling all_plugins_inventory to load vars for 
managed_node3 30575 1726867688.82670: Calling all_plugins_play to load vars for managed_node3 30575 1726867688.82672: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867688.82675: Calling groups_plugins_play to load vars for managed_node3 30575 1726867688.83247: WORKER PROCESS EXITING 30575 1726867688.83449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867688.84401: done with get_vars() 30575 1726867688.84414: variable 'ansible_search_path' from source: unknown 30575 1726867688.84415: variable 'ansible_search_path' from source: unknown 30575 1726867688.84441: we have included files to process 30575 1726867688.84442: generating all_blocks data 30575 1726867688.84443: done generating all_blocks data 30575 1726867688.84447: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30575 1726867688.84447: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30575 1726867688.84449: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30575 1726867688.84519: in VariableManager get_vars() 30575 1726867688.84533: done with get_vars() 30575 1726867688.84605: done processing included file 30575 1726867688.84606: iterating over new_blocks loaded from include file 30575 1726867688.84607: in VariableManager get_vars() 30575 1726867688.84620: done with get_vars() 30575 1726867688.84621: filtering new block on tags 30575 1726867688.84641: done filtering new block on tags 30575 1726867688.84643: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node3 => (item={'what': 
'tasks/assert_device_absent.yml', 'condition': True}) 30575 1726867688.84646: extending task lists for all hosts with included blocks 30575 1726867688.85408: done extending task lists 30575 1726867688.85409: done processing included files 30575 1726867688.85410: results queue empty 30575 1726867688.85410: checking for any_errors_fatal 30575 1726867688.85413: done checking for any_errors_fatal 30575 1726867688.85413: checking for max_fail_percentage 30575 1726867688.85414: done checking for max_fail_percentage 30575 1726867688.85414: checking to see if all hosts have failed and the running result is not ok 30575 1726867688.85415: done checking to see if all hosts have failed 30575 1726867688.85416: getting the remaining hosts for this loop 30575 1726867688.85416: done getting the remaining hosts for this loop 30575 1726867688.85420: getting the next task for host managed_node3 30575 1726867688.85423: done getting next task for host managed_node3 30575 1726867688.85424: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30575 1726867688.85426: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867688.85431: getting variables 30575 1726867688.85432: in VariableManager get_vars() 30575 1726867688.85441: Calling all_inventory to load vars for managed_node3 30575 1726867688.85443: Calling groups_inventory to load vars for managed_node3 30575 1726867688.85444: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867688.85448: Calling all_plugins_play to load vars for managed_node3 30575 1726867688.85449: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867688.85451: Calling groups_plugins_play to load vars for managed_node3 30575 1726867688.86086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867688.86914: done with get_vars() 30575 1726867688.86931: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 17:28:08 -0400 (0:00:00.076) 0:02:04.247 ****** 30575 1726867688.86982: entering _queue_task() for managed_node3/include_tasks 30575 1726867688.87229: worker is 1 (out of 1 available) 30575 1726867688.87243: exiting _queue_task() for managed_node3/include_tasks 30575 1726867688.87256: done queuing things up, now waiting for results queue to drain 30575 1726867688.87258: waiting for pending results... 
30575 1726867688.87444: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 30575 1726867688.87525: in run() - task 0affcac9-a3a5-e081-a588-0000000028d3 30575 1726867688.87537: variable 'ansible_search_path' from source: unknown 30575 1726867688.87541: variable 'ansible_search_path' from source: unknown 30575 1726867688.87570: calling self._execute() 30575 1726867688.87655: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.87659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.87669: variable 'omit' from source: magic vars 30575 1726867688.87960: variable 'ansible_distribution_major_version' from source: facts 30575 1726867688.87969: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867688.87975: _execute() done 30575 1726867688.87981: dumping result to json 30575 1726867688.87984: done dumping result, returning 30575 1726867688.87991: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-e081-a588-0000000028d3] 30575 1726867688.87997: sending task result for task 0affcac9-a3a5-e081-a588-0000000028d3 30575 1726867688.88088: done sending task result for task 0affcac9-a3a5-e081-a588-0000000028d3 30575 1726867688.88091: WORKER PROCESS EXITING 30575 1726867688.88120: no more pending results, returning what we have 30575 1726867688.88125: in VariableManager get_vars() 30575 1726867688.88184: Calling all_inventory to load vars for managed_node3 30575 1726867688.88186: Calling groups_inventory to load vars for managed_node3 30575 1726867688.88190: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867688.88203: Calling all_plugins_play to load vars for managed_node3 30575 1726867688.88206: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867688.88208: Calling groups_plugins_play to load vars for managed_node3 30575 
1726867688.89143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867688.89990: done with get_vars() 30575 1726867688.90003: variable 'ansible_search_path' from source: unknown 30575 1726867688.90004: variable 'ansible_search_path' from source: unknown 30575 1726867688.90102: variable 'item' from source: include params 30575 1726867688.90129: we have included files to process 30575 1726867688.90130: generating all_blocks data 30575 1726867688.90131: done generating all_blocks data 30575 1726867688.90132: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867688.90133: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867688.90135: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30575 1726867688.90260: done processing included file 30575 1726867688.90262: iterating over new_blocks loaded from include file 30575 1726867688.90263: in VariableManager get_vars() 30575 1726867688.90274: done with get_vars() 30575 1726867688.90275: filtering new block on tags 30575 1726867688.90292: done filtering new block on tags 30575 1726867688.90294: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 30575 1726867688.90297: extending task lists for all hosts with included blocks 30575 1726867688.90390: done extending task lists 30575 1726867688.90391: done processing included files 30575 1726867688.90391: results queue empty 30575 1726867688.90392: checking for any_errors_fatal 30575 1726867688.90395: done checking for any_errors_fatal 30575 1726867688.90395: checking for 
max_fail_percentage 30575 1726867688.90396: done checking for max_fail_percentage 30575 1726867688.90396: checking to see if all hosts have failed and the running result is not ok 30575 1726867688.90397: done checking to see if all hosts have failed 30575 1726867688.90397: getting the remaining hosts for this loop 30575 1726867688.90398: done getting the remaining hosts for this loop 30575 1726867688.90400: getting the next task for host managed_node3 30575 1726867688.90403: done getting next task for host managed_node3 30575 1726867688.90404: ^ task is: TASK: Get stat for interface {{ interface }} 30575 1726867688.90406: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867688.90408: getting variables 30575 1726867688.90408: in VariableManager get_vars() 30575 1726867688.90415: Calling all_inventory to load vars for managed_node3 30575 1726867688.90419: Calling groups_inventory to load vars for managed_node3 30575 1726867688.90420: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867688.90424: Calling all_plugins_play to load vars for managed_node3 30575 1726867688.90425: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867688.90427: Calling groups_plugins_play to load vars for managed_node3 30575 1726867688.91060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867688.91887: done with get_vars() 30575 1726867688.91902: done getting variables 30575 1726867688.91982: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:28:08 -0400 (0:00:00.050) 0:02:04.297 ****** 30575 1726867688.92004: entering _queue_task() for managed_node3/stat 30575 1726867688.92248: worker is 1 (out of 1 available) 30575 1726867688.92264: exiting _queue_task() for managed_node3/stat 30575 1726867688.92279: done queuing things up, now waiting for results queue to drain 30575 1726867688.92281: waiting for pending results... 
30575 1726867688.92461: running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr 30575 1726867688.92543: in run() - task 0affcac9-a3a5-e081-a588-000000002979 30575 1726867688.92554: variable 'ansible_search_path' from source: unknown 30575 1726867688.92557: variable 'ansible_search_path' from source: unknown 30575 1726867688.92590: calling self._execute() 30575 1726867688.92669: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.92672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.92685: variable 'omit' from source: magic vars 30575 1726867688.92962: variable 'ansible_distribution_major_version' from source: facts 30575 1726867688.92971: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867688.92979: variable 'omit' from source: magic vars 30575 1726867688.93012: variable 'omit' from source: magic vars 30575 1726867688.93082: variable 'interface' from source: play vars 30575 1726867688.93097: variable 'omit' from source: magic vars 30575 1726867688.93130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867688.93166: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867688.93183: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867688.93196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867688.93206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867688.93232: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867688.93235: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.93238: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.93308: Set connection var ansible_pipelining to False 30575 1726867688.93312: Set connection var ansible_shell_type to sh 30575 1726867688.93317: Set connection var ansible_shell_executable to /bin/sh 30575 1726867688.93324: Set connection var ansible_timeout to 10 30575 1726867688.93329: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867688.93336: Set connection var ansible_connection to ssh 30575 1726867688.93353: variable 'ansible_shell_executable' from source: unknown 30575 1726867688.93357: variable 'ansible_connection' from source: unknown 30575 1726867688.93359: variable 'ansible_module_compression' from source: unknown 30575 1726867688.93362: variable 'ansible_shell_type' from source: unknown 30575 1726867688.93364: variable 'ansible_shell_executable' from source: unknown 30575 1726867688.93366: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867688.93369: variable 'ansible_pipelining' from source: unknown 30575 1726867688.93373: variable 'ansible_timeout' from source: unknown 30575 1726867688.93375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867688.93517: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30575 1726867688.93528: variable 'omit' from source: magic vars 30575 1726867688.93534: starting attempt loop 30575 1726867688.93536: running the handler 30575 1726867688.93548: _low_level_execute_command(): starting 30575 1726867688.93556: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867688.94069: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867688.94074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867688.94079: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867688.94133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867688.94140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867688.94143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867688.94194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867688.95867: stdout chunk (state=3): >>>/root <<< 30575 1726867688.95990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867688.95994: stdout chunk (state=3): >>><<< 30575 1726867688.96003: stderr chunk (state=3): >>><<< 30575 1726867688.96019: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867688.96032: _low_level_execute_command(): starting 30575 1726867688.96038: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867688.9602098-35820-212450894507653 `" && echo ansible-tmp-1726867688.9602098-35820-212450894507653="` echo /root/.ansible/tmp/ansible-tmp-1726867688.9602098-35820-212450894507653 `" ) && sleep 0' 30575 1726867688.96470: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867688.96474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867688.96476: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address 
debug1: re-parsing configuration <<< 30575 1726867688.96487: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867688.96490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867688.96529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867688.96536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867688.96587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867688.98453: stdout chunk (state=3): >>>ansible-tmp-1726867688.9602098-35820-212450894507653=/root/.ansible/tmp/ansible-tmp-1726867688.9602098-35820-212450894507653 <<< 30575 1726867688.98560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867688.98583: stderr chunk (state=3): >>><<< 30575 1726867688.98587: stdout chunk (state=3): >>><<< 30575 1726867688.98602: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867688.9602098-35820-212450894507653=/root/.ansible/tmp/ansible-tmp-1726867688.9602098-35820-212450894507653 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867688.98638: variable 'ansible_module_compression' from source: unknown 30575 1726867688.98679: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30575 1726867688.98707: variable 'ansible_facts' from source: unknown 30575 1726867688.98771: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867688.9602098-35820-212450894507653/AnsiballZ_stat.py 30575 1726867688.98863: Sending initial data 30575 1726867688.98866: Sent initial data (153 bytes) 30575 1726867688.99281: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867688.99285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867688.99287: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867688.99289: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867688.99304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867688.99346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867688.99350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867688.99401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867689.00928: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30575 1726867689.00932: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867689.00966: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867689.01018: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp3pj3p95s /root/.ansible/tmp/ansible-tmp-1726867688.9602098-35820-212450894507653/AnsiballZ_stat.py <<< 30575 1726867689.01023: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867688.9602098-35820-212450894507653/AnsiballZ_stat.py" <<< 30575 1726867689.01055: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp3pj3p95s" to remote "/root/.ansible/tmp/ansible-tmp-1726867688.9602098-35820-212450894507653/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867688.9602098-35820-212450894507653/AnsiballZ_stat.py" <<< 30575 1726867689.01576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867689.01611: stderr chunk (state=3): >>><<< 30575 1726867689.01615: stdout chunk (state=3): >>><<< 30575 1726867689.01645: done transferring module to remote 30575 1726867689.01652: _low_level_execute_command(): starting 30575 1726867689.01656: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867688.9602098-35820-212450894507653/ /root/.ansible/tmp/ansible-tmp-1726867688.9602098-35820-212450894507653/AnsiballZ_stat.py && sleep 0' 30575 1726867689.02063: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867689.02067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867689.02069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 
1726867689.02071: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867689.02073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.02125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867689.02132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867689.02176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867689.03893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867689.03919: stderr chunk (state=3): >>><<< 30575 1726867689.03923: stdout chunk (state=3): >>><<< 30575 1726867689.03931: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867689.03935: _low_level_execute_command(): starting 30575 1726867689.03937: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867688.9602098-35820-212450894507653/AnsiballZ_stat.py && sleep 0' 30575 1726867689.04337: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867689.04340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.04342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867689.04344: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867689.04348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867689.04350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.04395: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867689.04399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867689.04451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867689.19639: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30575 1726867689.20923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867689.20949: stderr chunk (state=3): >>><<< 30575 1726867689.20952: stdout chunk (state=3): >>><<< 30575 1726867689.20966: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867689.20992: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867688.9602098-35820-212450894507653/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867689.21001: _low_level_execute_command(): starting 30575 1726867689.21004: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867688.9602098-35820-212450894507653/ > /dev/null 2>&1 && sleep 0' 30575 1726867689.21445: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867689.21448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.21451: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867689.21453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.21508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867689.21520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867689.21522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867689.21556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867689.23375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867689.23401: stderr chunk (state=3): >>><<< 30575 1726867689.23404: stdout chunk (state=3): >>><<< 30575 1726867689.23416: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867689.23422: handler run complete 30575 1726867689.23437: attempt loop complete, returning result 30575 1726867689.23440: _execute() done 30575 1726867689.23442: dumping result to json 30575 1726867689.23446: done dumping result, returning 30575 1726867689.23454: done running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr [0affcac9-a3a5-e081-a588-000000002979] 30575 1726867689.23458: sending task result for task 0affcac9-a3a5-e081-a588-000000002979 30575 1726867689.23554: done sending task result for task 0affcac9-a3a5-e081-a588-000000002979 30575 1726867689.23557: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30575 1726867689.23614: no more pending results, returning what we have 30575 1726867689.23621: results queue empty 30575 1726867689.23621: checking for any_errors_fatal 30575 1726867689.23623: done checking for any_errors_fatal 30575 1726867689.23623: checking for max_fail_percentage 30575 1726867689.23625: done checking for max_fail_percentage 30575 1726867689.23626: checking to see if all hosts have failed and the running result is not ok 30575 1726867689.23627: done checking to see if all hosts have failed 30575 1726867689.23628: getting the remaining hosts for this loop 30575 1726867689.23629: done getting the remaining hosts for this loop 30575 1726867689.23633: getting the next task for host managed_node3 30575 
1726867689.23644: done getting next task for host managed_node3 30575 1726867689.23647: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30575 1726867689.23651: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867689.23656: getting variables 30575 1726867689.23658: in VariableManager get_vars() 30575 1726867689.23709: Calling all_inventory to load vars for managed_node3 30575 1726867689.23712: Calling groups_inventory to load vars for managed_node3 30575 1726867689.23715: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867689.23728: Calling all_plugins_play to load vars for managed_node3 30575 1726867689.23730: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867689.23732: Calling groups_plugins_play to load vars for managed_node3 30575 1726867689.28740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867689.29572: done with get_vars() 30575 1726867689.29590: done getting variables 30575 1726867689.29625: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867689.29692: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 17:28:09 -0400 (0:00:00.377) 0:02:04.674 ****** 30575 1726867689.29711: entering _queue_task() for managed_node3/assert 30575 1726867689.29983: worker is 1 (out of 1 available) 30575 1726867689.29999: exiting _queue_task() for managed_node3/assert 30575 1726867689.30012: done queuing things up, now waiting for results queue to drain 30575 1726867689.30014: waiting for pending results... 
30575 1726867689.30197: running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' 30575 1726867689.30288: in run() - task 0affcac9-a3a5-e081-a588-0000000028d4 30575 1726867689.30300: variable 'ansible_search_path' from source: unknown 30575 1726867689.30303: variable 'ansible_search_path' from source: unknown 30575 1726867689.30334: calling self._execute() 30575 1726867689.30414: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867689.30422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867689.30427: variable 'omit' from source: magic vars 30575 1726867689.30699: variable 'ansible_distribution_major_version' from source: facts 30575 1726867689.30709: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867689.30715: variable 'omit' from source: magic vars 30575 1726867689.30748: variable 'omit' from source: magic vars 30575 1726867689.30822: variable 'interface' from source: play vars 30575 1726867689.30834: variable 'omit' from source: magic vars 30575 1726867689.30869: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867689.30902: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867689.30915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867689.30929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867689.30939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867689.30962: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867689.30965: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867689.30967: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867689.31040: Set connection var ansible_pipelining to False 30575 1726867689.31043: Set connection var ansible_shell_type to sh 30575 1726867689.31049: Set connection var ansible_shell_executable to /bin/sh 30575 1726867689.31054: Set connection var ansible_timeout to 10 30575 1726867689.31059: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867689.31065: Set connection var ansible_connection to ssh 30575 1726867689.31086: variable 'ansible_shell_executable' from source: unknown 30575 1726867689.31089: variable 'ansible_connection' from source: unknown 30575 1726867689.31092: variable 'ansible_module_compression' from source: unknown 30575 1726867689.31094: variable 'ansible_shell_type' from source: unknown 30575 1726867689.31097: variable 'ansible_shell_executable' from source: unknown 30575 1726867689.31099: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867689.31101: variable 'ansible_pipelining' from source: unknown 30575 1726867689.31103: variable 'ansible_timeout' from source: unknown 30575 1726867689.31108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867689.31205: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867689.31216: variable 'omit' from source: magic vars 30575 1726867689.31223: starting attempt loop 30575 1726867689.31226: running the handler 30575 1726867689.31326: variable 'interface_stat' from source: set_fact 30575 1726867689.31333: Evaluated conditional (not interface_stat.stat.exists): True 30575 1726867689.31340: handler run complete 30575 1726867689.31356: attempt loop complete, returning result 
30575 1726867689.31359: _execute() done 30575 1726867689.31361: dumping result to json 30575 1726867689.31364: done dumping result, returning 30575 1726867689.31370: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' [0affcac9-a3a5-e081-a588-0000000028d4] 30575 1726867689.31375: sending task result for task 0affcac9-a3a5-e081-a588-0000000028d4 30575 1726867689.31461: done sending task result for task 0affcac9-a3a5-e081-a588-0000000028d4 30575 1726867689.31464: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30575 1726867689.31511: no more pending results, returning what we have 30575 1726867689.31515: results queue empty 30575 1726867689.31515: checking for any_errors_fatal 30575 1726867689.31525: done checking for any_errors_fatal 30575 1726867689.31525: checking for max_fail_percentage 30575 1726867689.31527: done checking for max_fail_percentage 30575 1726867689.31528: checking to see if all hosts have failed and the running result is not ok 30575 1726867689.31529: done checking to see if all hosts have failed 30575 1726867689.31529: getting the remaining hosts for this loop 30575 1726867689.31531: done getting the remaining hosts for this loop 30575 1726867689.31534: getting the next task for host managed_node3 30575 1726867689.31543: done getting next task for host managed_node3 30575 1726867689.31545: ^ task is: TASK: Success in test '{{ lsr_description }}' 30575 1726867689.31548: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867689.31552: getting variables 30575 1726867689.31553: in VariableManager get_vars() 30575 1726867689.31600: Calling all_inventory to load vars for managed_node3 30575 1726867689.31603: Calling groups_inventory to load vars for managed_node3 30575 1726867689.31607: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867689.31616: Calling all_plugins_play to load vars for managed_node3 30575 1726867689.31619: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867689.31622: Calling groups_plugins_play to load vars for managed_node3 30575 1726867689.32392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867689.33254: done with get_vars() 30575 1726867689.33270: done getting variables 30575 1726867689.33308: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30575 1726867689.33387: variable 'lsr_description' from source: include params TASK [Success in test 'I will not get an error when I try to remove an absent profile'] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 17:28:09 -0400 (0:00:00.036) 0:02:04.711 ****** 30575 1726867689.33407: entering _queue_task() for managed_node3/debug 30575 1726867689.33612: worker is 1 (out of 1 available) 30575 1726867689.33627: exiting _queue_task() for managed_node3/debug 30575 1726867689.33641: done queuing things up, now waiting for results queue to drain 30575 1726867689.33643: waiting for pending results... 
30575 1726867689.33823: running TaskExecutor() for managed_node3/TASK: Success in test 'I will not get an error when I try to remove an absent profile' 30575 1726867689.33904: in run() - task 0affcac9-a3a5-e081-a588-0000000020b4 30575 1726867689.33916: variable 'ansible_search_path' from source: unknown 30575 1726867689.33922: variable 'ansible_search_path' from source: unknown 30575 1726867689.33949: calling self._execute() 30575 1726867689.34027: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867689.34032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867689.34042: variable 'omit' from source: magic vars 30575 1726867689.34304: variable 'ansible_distribution_major_version' from source: facts 30575 1726867689.34315: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867689.34322: variable 'omit' from source: magic vars 30575 1726867689.34347: variable 'omit' from source: magic vars 30575 1726867689.34423: variable 'lsr_description' from source: include params 30575 1726867689.34436: variable 'omit' from source: magic vars 30575 1726867689.34468: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867689.34496: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867689.34511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867689.34527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867689.34537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867689.34559: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867689.34562: variable 'ansible_host' from source: host vars for 
'managed_node3' 30575 1726867689.34564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867689.34631: Set connection var ansible_pipelining to False 30575 1726867689.34634: Set connection var ansible_shell_type to sh 30575 1726867689.34640: Set connection var ansible_shell_executable to /bin/sh 30575 1726867689.34645: Set connection var ansible_timeout to 10 30575 1726867689.34650: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867689.34656: Set connection var ansible_connection to ssh 30575 1726867689.34673: variable 'ansible_shell_executable' from source: unknown 30575 1726867689.34676: variable 'ansible_connection' from source: unknown 30575 1726867689.34681: variable 'ansible_module_compression' from source: unknown 30575 1726867689.34683: variable 'ansible_shell_type' from source: unknown 30575 1726867689.34686: variable 'ansible_shell_executable' from source: unknown 30575 1726867689.34688: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867689.34690: variable 'ansible_pipelining' from source: unknown 30575 1726867689.34693: variable 'ansible_timeout' from source: unknown 30575 1726867689.34695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867689.34792: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867689.34801: variable 'omit' from source: magic vars 30575 1726867689.34805: starting attempt loop 30575 1726867689.34808: running the handler 30575 1726867689.34849: handler run complete 30575 1726867689.34857: attempt loop complete, returning result 30575 1726867689.34860: _execute() done 30575 1726867689.34863: dumping result to json 30575 1726867689.34865: done 
dumping result, returning 30575 1726867689.34873: done running TaskExecutor() for managed_node3/TASK: Success in test 'I will not get an error when I try to remove an absent profile' [0affcac9-a3a5-e081-a588-0000000020b4] 30575 1726867689.34879: sending task result for task 0affcac9-a3a5-e081-a588-0000000020b4 30575 1726867689.34960: done sending task result for task 0affcac9-a3a5-e081-a588-0000000020b4 30575 1726867689.34963: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: +++++ Success in test 'I will not get an error when I try to remove an absent profile' +++++ 30575 1726867689.35006: no more pending results, returning what we have 30575 1726867689.35009: results queue empty 30575 1726867689.35010: checking for any_errors_fatal 30575 1726867689.35019: done checking for any_errors_fatal 30575 1726867689.35020: checking for max_fail_percentage 30575 1726867689.35022: done checking for max_fail_percentage 30575 1726867689.35023: checking to see if all hosts have failed and the running result is not ok 30575 1726867689.35024: done checking to see if all hosts have failed 30575 1726867689.35024: getting the remaining hosts for this loop 30575 1726867689.35026: done getting the remaining hosts for this loop 30575 1726867689.35029: getting the next task for host managed_node3 30575 1726867689.35037: done getting next task for host managed_node3 30575 1726867689.35039: ^ task is: TASK: Cleanup 30575 1726867689.35042: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 30575 1726867689.35046: getting variables 30575 1726867689.35047: in VariableManager get_vars() 30575 1726867689.35089: Calling all_inventory to load vars for managed_node3 30575 1726867689.35092: Calling groups_inventory to load vars for managed_node3 30575 1726867689.35095: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867689.35104: Calling all_plugins_play to load vars for managed_node3 30575 1726867689.35106: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867689.35109: Calling groups_plugins_play to load vars for managed_node3 30575 1726867689.35953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867689.36798: done with get_vars() 30575 1726867689.36813: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 17:28:09 -0400 (0:00:00.034) 0:02:04.746 ****** 30575 1726867689.36874: entering _queue_task() for managed_node3/include_tasks 30575 1726867689.37065: worker is 1 (out of 1 available) 30575 1726867689.37080: exiting _queue_task() for managed_node3/include_tasks 30575 1726867689.37092: done queuing things up, now waiting for results queue to drain 30575 1726867689.37093: waiting for pending results... 
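(Aside: the Cleanup task at run_test.yml:66 is an `include_tasks` driven by the `lsr_cleanup` include parameter; the repeated `variable 'item' from source: unknown` lines below are the loop being templated once per item, and the two files it expands to are named explicitly further down. A hedged sketch of its plausible shape, reconstructed from this log rather than the actual file:

```yaml
# Illustrative only; the real run_test.yml is not shown in this excerpt.
- name: Cleanup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_cleanup }}"
  # In this run, lsr_cleanup expands to two items, per the
  # "included: ... => (item=...)" lines later in the log:
  #   - tasks/cleanup_profile+device.yml
  #   - tasks/check_network_dns.yml
```

Each included file is then loaded, filtered on tags, and spliced into the host's task list, which is what the "extending task lists for all hosts with included blocks" line records.)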
30575 1726867689.37257: running TaskExecutor() for managed_node3/TASK: Cleanup 30575 1726867689.37333: in run() - task 0affcac9-a3a5-e081-a588-0000000020b8 30575 1726867689.37343: variable 'ansible_search_path' from source: unknown 30575 1726867689.37347: variable 'ansible_search_path' from source: unknown 30575 1726867689.37381: variable 'lsr_cleanup' from source: include params 30575 1726867689.37524: variable 'lsr_cleanup' from source: include params 30575 1726867689.37576: variable 'omit' from source: magic vars 30575 1726867689.37672: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867689.37683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867689.37691: variable 'omit' from source: magic vars 30575 1726867689.37853: variable 'ansible_distribution_major_version' from source: facts 30575 1726867689.37864: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867689.37868: variable 'item' from source: unknown 30575 1726867689.37915: variable 'item' from source: unknown 30575 1726867689.37936: variable 'item' from source: unknown 30575 1726867689.37980: variable 'item' from source: unknown 30575 1726867689.38100: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867689.38104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867689.38107: variable 'omit' from source: magic vars 30575 1726867689.38181: variable 'ansible_distribution_major_version' from source: facts 30575 1726867689.38184: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867689.38190: variable 'item' from source: unknown 30575 1726867689.38232: variable 'item' from source: unknown 30575 1726867689.38252: variable 'item' from source: unknown 30575 1726867689.38294: variable 'item' from source: unknown 30575 1726867689.38357: dumping result to json 30575 1726867689.38360: done dumping result, returning 30575 
1726867689.38362: done running TaskExecutor() for managed_node3/TASK: Cleanup [0affcac9-a3a5-e081-a588-0000000020b8] 30575 1726867689.38364: sending task result for task 0affcac9-a3a5-e081-a588-0000000020b8 30575 1726867689.38430: no more pending results, returning what we have 30575 1726867689.38435: in VariableManager get_vars() 30575 1726867689.38484: Calling all_inventory to load vars for managed_node3 30575 1726867689.38488: Calling groups_inventory to load vars for managed_node3 30575 1726867689.38492: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867689.38503: Calling all_plugins_play to load vars for managed_node3 30575 1726867689.38505: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867689.38508: Calling groups_plugins_play to load vars for managed_node3 30575 1726867689.39247: done sending task result for task 0affcac9-a3a5-e081-a588-0000000020b8 30575 1726867689.39251: WORKER PROCESS EXITING 30575 1726867689.39261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867689.40200: done with get_vars() 30575 1726867689.40213: variable 'ansible_search_path' from source: unknown 30575 1726867689.40214: variable 'ansible_search_path' from source: unknown 30575 1726867689.40239: variable 'ansible_search_path' from source: unknown 30575 1726867689.40240: variable 'ansible_search_path' from source: unknown 30575 1726867689.40256: we have included files to process 30575 1726867689.40256: generating all_blocks data 30575 1726867689.40258: done generating all_blocks data 30575 1726867689.40260: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867689.40260: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867689.40262: Loading data from 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30575 1726867689.40380: done processing included file 30575 1726867689.40382: iterating over new_blocks loaded from include file 30575 1726867689.40382: in VariableManager get_vars() 30575 1726867689.40393: done with get_vars() 30575 1726867689.40394: filtering new block on tags 30575 1726867689.40410: done filtering new block on tags 30575 1726867689.40412: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node3 => (item=tasks/cleanup_profile+device.yml) 30575 1726867689.40414: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 30575 1726867689.40415: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 30575 1726867689.40417: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 30575 1726867689.40636: done processing included file 30575 1726867689.40637: iterating over new_blocks loaded from include file 30575 1726867689.40638: in VariableManager get_vars() 30575 1726867689.40648: done with get_vars() 30575 1726867689.40649: filtering new block on tags 30575 1726867689.40667: done filtering new block on tags 30575 1726867689.40668: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 => (item=tasks/check_network_dns.yml) 30575 1726867689.40670: extending task lists for all hosts with included blocks 30575 1726867689.41549: done extending task lists 30575 1726867689.41550: done 
processing included files 30575 1726867689.41551: results queue empty 30575 1726867689.41551: checking for any_errors_fatal 30575 1726867689.41554: done checking for any_errors_fatal 30575 1726867689.41554: checking for max_fail_percentage 30575 1726867689.41555: done checking for max_fail_percentage 30575 1726867689.41555: checking to see if all hosts have failed and the running result is not ok 30575 1726867689.41556: done checking to see if all hosts have failed 30575 1726867689.41556: getting the remaining hosts for this loop 30575 1726867689.41557: done getting the remaining hosts for this loop 30575 1726867689.41559: getting the next task for host managed_node3 30575 1726867689.41561: done getting next task for host managed_node3 30575 1726867689.41562: ^ task is: TASK: Cleanup profile and device 30575 1726867689.41564: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867689.41566: getting variables 30575 1726867689.41566: in VariableManager get_vars() 30575 1726867689.41574: Calling all_inventory to load vars for managed_node3 30575 1726867689.41581: Calling groups_inventory to load vars for managed_node3 30575 1726867689.41582: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867689.41586: Calling all_plugins_play to load vars for managed_node3 30575 1726867689.41587: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867689.41589: Calling groups_plugins_play to load vars for managed_node3 30575 1726867689.42193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867689.43018: done with get_vars() 30575 1726867689.43033: done getting variables 30575 1726867689.43058: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 17:28:09 -0400 (0:00:00.062) 0:02:04.808 ****** 30575 1726867689.43079: entering _queue_task() for managed_node3/shell 30575 1726867689.43294: worker is 1 (out of 1 available) 30575 1726867689.43309: exiting _queue_task() for managed_node3/shell 30575 1726867689.43322: done queuing things up, now waiting for results queue to drain 30575 1726867689.43324: waiting for pending results... 
30575 1726867689.43508: running TaskExecutor() for managed_node3/TASK: Cleanup profile and device 30575 1726867689.43579: in run() - task 0affcac9-a3a5-e081-a588-00000000299e 30575 1726867689.43591: variable 'ansible_search_path' from source: unknown 30575 1726867689.43594: variable 'ansible_search_path' from source: unknown 30575 1726867689.43623: calling self._execute() 30575 1726867689.43702: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867689.43707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867689.43716: variable 'omit' from source: magic vars 30575 1726867689.43986: variable 'ansible_distribution_major_version' from source: facts 30575 1726867689.43994: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867689.43999: variable 'omit' from source: magic vars 30575 1726867689.44033: variable 'omit' from source: magic vars 30575 1726867689.44141: variable 'interface' from source: play vars 30575 1726867689.44156: variable 'omit' from source: magic vars 30575 1726867689.44190: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867689.44219: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867689.44236: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867689.44250: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867689.44261: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867689.44286: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867689.44289: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867689.44293: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867689.44362: Set connection var ansible_pipelining to False 30575 1726867689.44366: Set connection var ansible_shell_type to sh 30575 1726867689.44369: Set connection var ansible_shell_executable to /bin/sh 30575 1726867689.44375: Set connection var ansible_timeout to 10 30575 1726867689.44381: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867689.44388: Set connection var ansible_connection to ssh 30575 1726867689.44405: variable 'ansible_shell_executable' from source: unknown 30575 1726867689.44408: variable 'ansible_connection' from source: unknown 30575 1726867689.44412: variable 'ansible_module_compression' from source: unknown 30575 1726867689.44415: variable 'ansible_shell_type' from source: unknown 30575 1726867689.44417: variable 'ansible_shell_executable' from source: unknown 30575 1726867689.44419: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867689.44423: variable 'ansible_pipelining' from source: unknown 30575 1726867689.44425: variable 'ansible_timeout' from source: unknown 30575 1726867689.44433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867689.44528: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867689.44541: variable 'omit' from source: magic vars 30575 1726867689.44545: starting attempt loop 30575 1726867689.44547: running the handler 30575 1726867689.44554: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867689.44570: _low_level_execute_command(): starting 30575 1726867689.44576: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867689.45088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867689.45092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.45095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867689.45097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.45149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867689.45152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867689.45154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867689.45209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867689.46908: stdout chunk (state=3): >>>/root <<< 30575 1726867689.47007: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867689.47035: stderr chunk (state=3): >>><<< 30575 1726867689.47038: stdout chunk (state=3): >>><<< 30575 1726867689.47059: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867689.47071: _low_level_execute_command(): starting 30575 1726867689.47076: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867689.4705927-35831-33217531463335 `" && echo ansible-tmp-1726867689.4705927-35831-33217531463335="` echo /root/.ansible/tmp/ansible-tmp-1726867689.4705927-35831-33217531463335 `" ) && sleep 0' 30575 1726867689.47482: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867689.47492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.47509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867689.47512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.47549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867689.47553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867689.47610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867689.49473: stdout chunk (state=3): >>>ansible-tmp-1726867689.4705927-35831-33217531463335=/root/.ansible/tmp/ansible-tmp-1726867689.4705927-35831-33217531463335 <<< 30575 1726867689.49579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867689.49602: stderr chunk (state=3): >>><<< 30575 1726867689.49605: stdout chunk (state=3): >>><<< 30575 1726867689.49616: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867689.4705927-35831-33217531463335=/root/.ansible/tmp/ansible-tmp-1726867689.4705927-35831-33217531463335 , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867689.49644: variable 'ansible_module_compression' from source: unknown 30575 1726867689.49684: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867689.49714: variable 'ansible_facts' from source: unknown 30575 1726867689.49772: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867689.4705927-35831-33217531463335/AnsiballZ_command.py 30575 1726867689.49862: Sending initial data 30575 1726867689.49865: Sent initial data (155 bytes) 30575 1726867689.50296: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867689.50299: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867689.50301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30575 1726867689.50306: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 30575 1726867689.50308: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.50356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867689.50359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867689.50408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867689.51939: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30575 1726867689.51946: stderr chunk 
(state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867689.51981: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867689.52029: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp08k0wm7p /root/.ansible/tmp/ansible-tmp-1726867689.4705927-35831-33217531463335/AnsiballZ_command.py <<< 30575 1726867689.52032: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867689.4705927-35831-33217531463335/AnsiballZ_command.py" <<< 30575 1726867689.52066: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp08k0wm7p" to remote "/root/.ansible/tmp/ansible-tmp-1726867689.4705927-35831-33217531463335/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867689.4705927-35831-33217531463335/AnsiballZ_command.py" <<< 30575 1726867689.52609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867689.52641: stderr chunk (state=3): >>><<< 30575 1726867689.52645: stdout chunk (state=3): >>><<< 30575 1726867689.52683: done transferring module to remote 30575 1726867689.52690: _low_level_execute_command(): starting 30575 1726867689.52695: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867689.4705927-35831-33217531463335/ /root/.ansible/tmp/ansible-tmp-1726867689.4705927-35831-33217531463335/AnsiballZ_command.py && sleep 0' 30575 1726867689.53082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867689.53086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.53102: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.53154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867689.53157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867689.53206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867689.54938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867689.54958: stderr chunk (state=3): >>><<< 30575 1726867689.54961: stdout chunk (state=3): >>><<< 30575 1726867689.54973: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867689.54978: _low_level_execute_command(): starting 30575 1726867689.54981: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867689.4705927-35831-33217531463335/AnsiballZ_command.py && sleep 0' 30575 1726867689.55372: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867689.55375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.55378: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867689.55381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867689.55383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.55431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867689.55438: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867689.55483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867689.73665: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 17:28:09.703325", "end": "2024-09-20 17:28:09.734533", "delta": "0:00:00.031208", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867689.75091: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867689.75116: stderr chunk (state=3): >>><<< 30575 1726867689.75122: stdout chunk (state=3): >>><<< 30575 1726867689.75138: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 17:28:09.703325", "end": "2024-09-20 17:28:09.734533", "delta": "0:00:00.031208", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. 30575 1726867689.75170: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867689.4705927-35831-33217531463335/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867689.75181: _low_level_execute_command(): starting 30575 1726867689.75184: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867689.4705927-35831-33217531463335/ > /dev/null 2>&1 && sleep 0' 30575 1726867689.75615: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867689.75621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.75637: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.75683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867689.75697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867689.75745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867689.77539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867689.77562: stderr chunk (state=3): >>><<< 30575 1726867689.77565: stdout chunk (state=3): >>><<< 30575 1726867689.77579: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867689.77586: handler run complete 30575 1726867689.77603: Evaluated conditional (False): False 30575 1726867689.77611: attempt loop complete, returning result 30575 1726867689.77614: _execute() done 30575 1726867689.77616: dumping result to json 30575 1726867689.77624: done dumping result, returning 30575 1726867689.77631: done running TaskExecutor() for managed_node3/TASK: Cleanup profile and device [0affcac9-a3a5-e081-a588-00000000299e] 30575 1726867689.77636: sending task result for task 0affcac9-a3a5-e081-a588-00000000299e 30575 1726867689.77729: done sending task result for task 0affcac9-a3a5-e081-a588-00000000299e 30575 1726867689.77731: WORKER PROCESS EXITING fatal: [managed_node3]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.031208", "end": "2024-09-20 17:28:09.734533", "rc": 1, "start": "2024-09-20 17:28:09.703325" } STDERR: Error: unknown connection 'statebr'. Error: cannot delete unknown connection(s): 'statebr'. 
Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 30575 1726867689.77794: no more pending results, returning what we have 30575 1726867689.77799: results queue empty 30575 1726867689.77800: checking for any_errors_fatal 30575 1726867689.77801: done checking for any_errors_fatal 30575 1726867689.77802: checking for max_fail_percentage 30575 1726867689.77803: done checking for max_fail_percentage 30575 1726867689.77804: checking to see if all hosts have failed and the running result is not ok 30575 1726867689.77805: done checking to see if all hosts have failed 30575 1726867689.77806: getting the remaining hosts for this loop 30575 1726867689.77807: done getting the remaining hosts for this loop 30575 1726867689.77810: getting the next task for host managed_node3 30575 1726867689.77820: done getting next task for host managed_node3 30575 1726867689.77823: ^ task is: TASK: Check routes and DNS 30575 1726867689.77826: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867689.77831: getting variables 30575 1726867689.77832: in VariableManager get_vars() 30575 1726867689.77885: Calling all_inventory to load vars for managed_node3 30575 1726867689.77888: Calling groups_inventory to load vars for managed_node3 30575 1726867689.77891: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867689.77901: Calling all_plugins_play to load vars for managed_node3 30575 1726867689.77904: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867689.77906: Calling groups_plugins_play to load vars for managed_node3 30575 1726867689.78828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867689.79687: done with get_vars() 30575 1726867689.79703: done getting variables 30575 1726867689.79745: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 17:28:09 -0400 (0:00:00.366) 0:02:05.175 ****** 30575 1726867689.79766: entering _queue_task() for managed_node3/shell 30575 1726867689.79982: worker is 1 (out of 1 available) 30575 1726867689.79997: exiting _queue_task() for managed_node3/shell 30575 1726867689.80010: done queuing things up, now waiting for results queue to drain 30575 1726867689.80012: waiting for pending results... 
30575 1726867689.80198: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 30575 1726867689.80278: in run() - task 0affcac9-a3a5-e081-a588-0000000029a2 30575 1726867689.80291: variable 'ansible_search_path' from source: unknown 30575 1726867689.80295: variable 'ansible_search_path' from source: unknown 30575 1726867689.80325: calling self._execute() 30575 1726867689.80402: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867689.80406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867689.80415: variable 'omit' from source: magic vars 30575 1726867689.80695: variable 'ansible_distribution_major_version' from source: facts 30575 1726867689.80704: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867689.80710: variable 'omit' from source: magic vars 30575 1726867689.80744: variable 'omit' from source: magic vars 30575 1726867689.80767: variable 'omit' from source: magic vars 30575 1726867689.80801: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867689.80830: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867689.80846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867689.80859: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867689.80869: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867689.80896: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867689.80899: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867689.80902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867689.80973: 
Set connection var ansible_pipelining to False 30575 1726867689.80976: Set connection var ansible_shell_type to sh 30575 1726867689.80980: Set connection var ansible_shell_executable to /bin/sh 30575 1726867689.80987: Set connection var ansible_timeout to 10 30575 1726867689.80991: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867689.81001: Set connection var ansible_connection to ssh 30575 1726867689.81017: variable 'ansible_shell_executable' from source: unknown 30575 1726867689.81023: variable 'ansible_connection' from source: unknown 30575 1726867689.81026: variable 'ansible_module_compression' from source: unknown 30575 1726867689.81028: variable 'ansible_shell_type' from source: unknown 30575 1726867689.81030: variable 'ansible_shell_executable' from source: unknown 30575 1726867689.81033: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867689.81037: variable 'ansible_pipelining' from source: unknown 30575 1726867689.81040: variable 'ansible_timeout' from source: unknown 30575 1726867689.81042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867689.81144: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867689.81153: variable 'omit' from source: magic vars 30575 1726867689.81157: starting attempt loop 30575 1726867689.81160: running the handler 30575 1726867689.81169: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867689.81187: 
_low_level_execute_command(): starting 30575 1726867689.81194: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867689.81697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867689.81701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867689.81705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.81754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867689.81758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867689.81810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867689.83397: stdout chunk (state=3): >>>/root <<< 30575 1726867689.83495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867689.83520: stderr chunk (state=3): >>><<< 30575 1726867689.83527: stdout chunk (state=3): >>><<< 30575 1726867689.83548: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867689.83559: _low_level_execute_command(): starting 30575 1726867689.83564: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867689.8354795-35842-72464364688340 `" && echo ansible-tmp-1726867689.8354795-35842-72464364688340="` echo /root/.ansible/tmp/ansible-tmp-1726867689.8354795-35842-72464364688340 `" ) && sleep 0' 30575 1726867689.83970: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867689.83973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.83984: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867689.83986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.84031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867689.84038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867689.84082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867689.85947: stdout chunk (state=3): >>>ansible-tmp-1726867689.8354795-35842-72464364688340=/root/.ansible/tmp/ansible-tmp-1726867689.8354795-35842-72464364688340 <<< 30575 1726867689.86055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867689.86076: stderr chunk (state=3): >>><<< 30575 1726867689.86082: stdout chunk (state=3): >>><<< 30575 1726867689.86093: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867689.8354795-35842-72464364688340=/root/.ansible/tmp/ansible-tmp-1726867689.8354795-35842-72464364688340 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867689.86119: variable 'ansible_module_compression' from source: unknown 30575 1726867689.86159: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867689.86191: variable 'ansible_facts' from source: unknown 30575 1726867689.86247: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867689.8354795-35842-72464364688340/AnsiballZ_command.py 30575 1726867689.86337: Sending initial data 30575 1726867689.86341: Sent initial data (155 bytes) 30575 1726867689.86758: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867689.86761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867689.86764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.86766: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867689.86768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.86820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867689.86826: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867689.86869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867689.88383: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30575 1726867689.88387: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867689.88423: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30575 1726867689.88467: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmplmx35lo8 /root/.ansible/tmp/ansible-tmp-1726867689.8354795-35842-72464364688340/AnsiballZ_command.py <<< 30575 1726867689.88470: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867689.8354795-35842-72464364688340/AnsiballZ_command.py" <<< 30575 1726867689.88511: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmplmx35lo8" to remote "/root/.ansible/tmp/ansible-tmp-1726867689.8354795-35842-72464364688340/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867689.8354795-35842-72464364688340/AnsiballZ_command.py" <<< 30575 1726867689.89041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867689.89076: stderr chunk (state=3): >>><<< 30575 1726867689.89081: stdout chunk (state=3): >>><<< 30575 1726867689.89116: done transferring module to remote 30575 1726867689.89125: _low_level_execute_command(): starting 30575 1726867689.89127: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867689.8354795-35842-72464364688340/ /root/.ansible/tmp/ansible-tmp-1726867689.8354795-35842-72464364688340/AnsiballZ_command.py && sleep 0' 30575 1726867689.89533: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867689.89536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867689.89538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 30575 1726867689.89540: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867689.89545: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867689.89547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.89593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867689.89596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867689.89642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867689.91358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867689.91380: stderr chunk (state=3): >>><<< 30575 1726867689.91383: stdout chunk (state=3): >>><<< 30575 1726867689.91394: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867689.91398: _low_level_execute_command(): starting 30575 1726867689.91400: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867689.8354795-35842-72464364688340/AnsiballZ_command.py && sleep 0' 30575 1726867689.91782: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867689.91786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.91795: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867689.91807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867689.91858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867689.91862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867689.91916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867690.07908: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:de:45:ad:8b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.68/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2695sec preferred_lft 2695sec\n inet6 fe80::8ff:deff:fe45:ad8b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.68 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.68 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 17:28:10.067714", "end": "2024-09-20 17:28:10.076891", "delta": "0:00:00.009177", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n 
ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867690.09342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 30575 1726867690.09369: stderr chunk (state=3): >>><<< 30575 1726867690.09372: stdout chunk (state=3): >>><<< 30575 1726867690.09391: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:de:45:ad:8b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.68/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2695sec preferred_lft 2695sec\n inet6 fe80::8ff:deff:fe45:ad8b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.68 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.68 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 17:28:10.067714", "end": "2024-09-20 17:28:10.076891", "delta": "0:00:00.009177", "msg": "", 
"invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
30575 1726867690.09433: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867689.8354795-35842-72464364688340/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867690.09440: _low_level_execute_command(): starting 30575 1726867690.09445: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867689.8354795-35842-72464364688340/ > /dev/null 2>&1 && sleep 0' 30575 1726867690.09876: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867690.09882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867690.09897: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867690.09947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867690.09950: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867690.10001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867690.11782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867690.11806: stderr chunk (state=3): >>><<< 30575 1726867690.11809: stdout chunk (state=3): >>><<< 30575 1726867690.11883: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867690.11886: handler run complete 30575 1726867690.11888: Evaluated conditional (False): False 30575 1726867690.11891: attempt loop complete, returning result 30575 1726867690.11892: _execute() done 30575 1726867690.11895: dumping result to json 30575 1726867690.11897: done dumping result, returning 30575 1726867690.11899: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [0affcac9-a3a5-e081-a588-0000000029a2] 30575 1726867690.11901: sending task result for task 0affcac9-a3a5-e081-a588-0000000029a2 30575 1726867690.11967: done sending task result for task 0affcac9-a3a5-e081-a588-0000000029a2 30575 1726867690.11970: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009177", "end": "2024-09-20 17:28:10.076891", "rc": 0, "start": "2024-09-20 17:28:10.067714" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:de:45:ad:8b brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.15.68/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 2695sec preferred_lft 2695sec inet6 fe80::8ff:deff:fe45:ad8b/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.68 metric 100 
10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.68 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 30575 1726867690.12038: no more pending results, returning what we have 30575 1726867690.12041: results queue empty 30575 1726867690.12042: checking for any_errors_fatal 30575 1726867690.12052: done checking for any_errors_fatal 30575 1726867690.12052: checking for max_fail_percentage 30575 1726867690.12054: done checking for max_fail_percentage 30575 1726867690.12055: checking to see if all hosts have failed and the running result is not ok 30575 1726867690.12056: done checking to see if all hosts have failed 30575 1726867690.12056: getting the remaining hosts for this loop 30575 1726867690.12057: done getting the remaining hosts for this loop 30575 1726867690.12061: getting the next task for host managed_node3 30575 1726867690.12070: done getting next task for host managed_node3 30575 1726867690.12074: ^ task is: TASK: Verify DNS and network connectivity 30575 1726867690.12081: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 30575 1726867690.12090: getting variables 30575 1726867690.12092: in VariableManager get_vars() 30575 1726867690.12141: Calling all_inventory to load vars for managed_node3 30575 1726867690.12143: Calling groups_inventory to load vars for managed_node3 30575 1726867690.12146: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867690.12156: Calling all_plugins_play to load vars for managed_node3 30575 1726867690.12158: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867690.12161: Calling groups_plugins_play to load vars for managed_node3 30575 1726867690.12986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867690.13860: done with get_vars() 30575 1726867690.13876: done getting variables 30575 1726867690.13922: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 17:28:10 -0400 (0:00:00.341) 0:02:05.517 ****** 30575 1726867690.13947: entering _queue_task() for managed_node3/shell 30575 1726867690.14181: worker is 1 (out of 1 available) 30575 1726867690.14195: exiting _queue_task() for managed_node3/shell 30575 1726867690.14209: done queuing things up, now waiting for results queue to drain 30575 1726867690.14211: waiting for pending results... 
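The "Check routes and DNS" task that just completed runs a small diagnostic script on the remote host; the sketch below reproduces that script from the `_raw_params` shown in the module result above. One adaptation is assumed and not in the original: `2>/dev/null || :` guards replace the original `set -euo pipefail` so the sketch degrades gracefully on hosts where the `ip` utility (iproute2) is unavailable.

```shell
# Diagnostic script from the "Check routes and DNS" task (reproduced from the
# module_args in the log above). The `|| :` guards are an added assumption so
# the sketch still runs where `ip` is missing; the task itself uses
# `set -euo pipefail` and fails hard instead.
collect_net_diag() {
    echo IP
    ip a 2>/dev/null || :
    echo "IP ROUTE"
    ip route 2>/dev/null || :
    echo "IP -6 ROUTE"
    ip -6 route 2>/dev/null || :
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf
    else
        echo "NO /etc/resolv.conf"
        ls -alrtF /etc/resolv.* 2>/dev/null || :
    fi
}

diag_output=$(collect_net_diag)
printf '%s\n' "$diag_output"
```

In the log above this script exits with rc=0 and its stdout is sectioned by the IP, IP ROUTE, IP -6 ROUTE, and RESOLV headers, which is what makes the callback output easy to scan for the default route and nameservers.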
30575 1726867690.14396: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 30575 1726867690.14478: in run() - task 0affcac9-a3a5-e081-a588-0000000029a3 30575 1726867690.14492: variable 'ansible_search_path' from source: unknown 30575 1726867690.14496: variable 'ansible_search_path' from source: unknown 30575 1726867690.14525: calling self._execute() 30575 1726867690.14605: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867690.14609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867690.14620: variable 'omit' from source: magic vars 30575 1726867690.14895: variable 'ansible_distribution_major_version' from source: facts 30575 1726867690.14905: Evaluated conditional (ansible_distribution_major_version != '6'): True 30575 1726867690.15002: variable 'ansible_facts' from source: unknown 30575 1726867690.15608: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 30575 1726867690.15613: variable 'omit' from source: magic vars 30575 1726867690.15652: variable 'omit' from source: magic vars 30575 1726867690.15676: variable 'omit' from source: magic vars 30575 1726867690.15711: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30575 1726867690.15741: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30575 1726867690.15757: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30575 1726867690.15770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867690.15780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30575 1726867690.15804: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30575 1726867690.15807: variable 
'ansible_host' from source: host vars for 'managed_node3' 30575 1726867690.15810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867690.15888: Set connection var ansible_pipelining to False 30575 1726867690.15891: Set connection var ansible_shell_type to sh 30575 1726867690.15898: Set connection var ansible_shell_executable to /bin/sh 30575 1726867690.15903: Set connection var ansible_timeout to 10 30575 1726867690.15908: Set connection var ansible_module_compression to ZIP_DEFLATED 30575 1726867690.15914: Set connection var ansible_connection to ssh 30575 1726867690.15933: variable 'ansible_shell_executable' from source: unknown 30575 1726867690.15936: variable 'ansible_connection' from source: unknown 30575 1726867690.15938: variable 'ansible_module_compression' from source: unknown 30575 1726867690.15941: variable 'ansible_shell_type' from source: unknown 30575 1726867690.15943: variable 'ansible_shell_executable' from source: unknown 30575 1726867690.15945: variable 'ansible_host' from source: host vars for 'managed_node3' 30575 1726867690.15947: variable 'ansible_pipelining' from source: unknown 30575 1726867690.15949: variable 'ansible_timeout' from source: unknown 30575 1726867690.15954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30575 1726867690.16052: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867690.16063: variable 'omit' from source: magic vars 30575 1726867690.16070: starting attempt loop 30575 1726867690.16073: running the handler 30575 1726867690.16085: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30575 1726867690.16098: _low_level_execute_command(): starting 30575 1726867690.16106: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30575 1726867690.16622: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867690.16626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867690.16629: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867690.16631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867690.16683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867690.16687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867690.16689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867690.16745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867690.18324: stdout chunk (state=3): >>>/root <<< 30575 1726867690.18424: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867690.18447: stderr chunk (state=3): >>><<< 30575 1726867690.18452: stdout chunk (state=3): >>><<< 30575 1726867690.18474: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867690.18487: _low_level_execute_command(): starting 30575 1726867690.18492: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867690.1847332-35850-52818689854922 `" && echo ansible-tmp-1726867690.1847332-35850-52818689854922="` echo /root/.ansible/tmp/ansible-tmp-1726867690.1847332-35850-52818689854922 `" ) && sleep 0' 30575 1726867690.18904: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867690.18907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867690.18910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867690.18912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867690.18960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867690.18964: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867690.19016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867690.20876: stdout chunk (state=3): >>>ansible-tmp-1726867690.1847332-35850-52818689854922=/root/.ansible/tmp/ansible-tmp-1726867690.1847332-35850-52818689854922 <<< 30575 1726867690.20984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867690.21005: stderr chunk (state=3): >>><<< 30575 1726867690.21008: stdout chunk (state=3): >>><<< 30575 1726867690.21023: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867690.1847332-35850-52818689854922=/root/.ansible/tmp/ansible-tmp-1726867690.1847332-35850-52818689854922 , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867690.21044: variable 'ansible_module_compression' from source: unknown 30575 1726867690.21085: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30575uphanqjn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30575 1726867690.21115: variable 'ansible_facts' from source: unknown 30575 1726867690.21171: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867690.1847332-35850-52818689854922/AnsiballZ_command.py 30575 1726867690.21264: Sending initial data 30575 1726867690.21268: Sent initial data (155 bytes) 30575 1726867690.21676: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867690.21681: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867690.21684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 30575 1726867690.21686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867690.21688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867690.21739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867690.21742: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867690.21746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867690.21790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867690.23313: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 <<< 30575 1726867690.23323: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30575 1726867690.23353: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30575 1726867690.23398: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30575uphanqjn/tmp_a5a7yo3 /root/.ansible/tmp/ansible-tmp-1726867690.1847332-35850-52818689854922/AnsiballZ_command.py <<< 30575 1726867690.23406: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867690.1847332-35850-52818689854922/AnsiballZ_command.py" <<< 30575 1726867690.23446: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30575uphanqjn/tmp_a5a7yo3" to remote "/root/.ansible/tmp/ansible-tmp-1726867690.1847332-35850-52818689854922/AnsiballZ_command.py" <<< 30575 1726867690.23450: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867690.1847332-35850-52818689854922/AnsiballZ_command.py" <<< 30575 1726867690.23981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867690.24013: stderr chunk (state=3): >>><<< 30575 1726867690.24016: stdout chunk (state=3): >>><<< 30575 1726867690.24038: done transferring module to remote 30575 1726867690.24046: _low_level_execute_command(): starting 30575 1726867690.24049: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867690.1847332-35850-52818689854922/ /root/.ansible/tmp/ansible-tmp-1726867690.1847332-35850-52818689854922/AnsiballZ_command.py && sleep 0' 30575 1726867690.24455: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867690.24464: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867690.24466: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867690.24469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867690.24471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867690.24513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867690.24517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867690.24565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867690.26282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867690.26303: stderr chunk (state=3): >>><<< 30575 1726867690.26306: stdout chunk (state=3): >>><<< 30575 1726867690.26320: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867690.26324: _low_level_execute_command(): starting 30575 1726867690.26326: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867690.1847332-35850-52818689854922/AnsiballZ_command.py && sleep 0' 30575 1726867690.26744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867690.26747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 30575 1726867690.26750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867690.26752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30575 1726867690.26754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 30575 1726867690.26756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867690.26805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867690.26812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867690.26814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867690.26858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867690.90198: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 
wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1441 0 --:--:-- --:--:-- --:--:-- 1445\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1442 0 --:--:-- --:--:-- --:--:-- 1440", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 17:28:10.418387", "end": "2024-09-20 17:28:10.899628", "delta": "0:00:00.481241", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30575 1726867690.91832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 30575 1726867690.91861: stderr chunk (state=3): >>><<< 30575 1726867690.91864: stdout chunk (state=3): >>><<< 30575 1726867690.91887: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1441 0 --:--:-- --:--:-- --:--:-- 1445\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1442 0 --:--:-- --:--:-- --:--:-- 1440", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor 
host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 17:28:10.418387", "end": "2024-09-20 17:28:10.899628", "delta": "0:00:00.481241", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 30575 1726867690.91927: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867690.1847332-35850-52818689854922/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30575 1726867690.91934: _low_level_execute_command(): starting 30575 1726867690.91940: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867690.1847332-35850-52818689854922/ > /dev/null 2>&1 && sleep 0' 30575 1726867690.92375: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867690.92414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30575 1726867690.92420: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867690.92423: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30575 1726867690.92425: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30575 1726867690.92427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30575 1726867690.92475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 30575 1726867690.92482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30575 1726867690.92485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30575 1726867690.92528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30575 1726867690.94352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30575 1726867690.94378: stderr chunk (state=3): >>><<< 30575 1726867690.94382: stdout chunk (state=3): >>><<< 30575 1726867690.94394: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30575 1726867690.94400: handler run complete 30575 1726867690.94420: Evaluated conditional (False): False 30575 1726867690.94428: attempt loop complete, returning result 30575 1726867690.94430: _execute() done 30575 1726867690.94432: dumping result to json 30575 1726867690.94438: done dumping result, returning 30575 1726867690.94446: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [0affcac9-a3a5-e081-a588-0000000029a3] 30575 1726867690.94451: sending task result for task 0affcac9-a3a5-e081-a588-0000000029a3 30575 1726867690.94552: done sending task result for task 0affcac9-a3a5-e081-a588-0000000029a3 30575 1726867690.94555: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.481241", "end": "2024-09-20 17:28:10.899628", "rc": 0, "start": "2024-09-20 17:28:10.418387" }

STDOUT:

CHECK DNS AND CONNECTIVITY
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

 % Total % Received % Xferd Average Speed Time Time Time Current
 Dload Upload Total Spent Left Speed
 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 305 100 305 0 0 1441 0 --:--:-- --:--:-- --:--:-- 1445
 % Total % Received % Xferd Average Speed Time Time Time Current
 Dload Upload Total Spent Left Speed
 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 291 100 291 0 0 1442 0 --:--:-- --:--:-- --:--:-- 1440

30575 1726867690.94622: no more pending results, returning what we have 30575 1726867690.94626:
results queue empty 30575 1726867690.94627: checking for any_errors_fatal 30575 1726867690.94634: done checking for any_errors_fatal 30575 1726867690.94635: checking for max_fail_percentage 30575 1726867690.94637: done checking for max_fail_percentage 30575 1726867690.94638: checking to see if all hosts have failed and the running result is not ok 30575 1726867690.94639: done checking to see if all hosts have failed 30575 1726867690.94643: getting the remaining hosts for this loop 30575 1726867690.94645: done getting the remaining hosts for this loop 30575 1726867690.94648: getting the next task for host managed_node3 30575 1726867690.94659: done getting next task for host managed_node3 30575 1726867690.94661: ^ task is: TASK: meta (flush_handlers) 30575 1726867690.94663: ^ state is: HOST STATE: block=9, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867690.94667: getting variables 30575 1726867690.94668: in VariableManager get_vars() 30575 1726867690.94723: Calling all_inventory to load vars for managed_node3 30575 1726867690.94726: Calling groups_inventory to load vars for managed_node3 30575 1726867690.94729: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867690.94739: Calling all_plugins_play to load vars for managed_node3 30575 1726867690.94742: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867690.94744: Calling groups_plugins_play to load vars for managed_node3 30575 1726867690.95710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867690.96574: done with get_vars() 30575 1726867690.96595: done getting variables 30575 1726867690.96651: in VariableManager get_vars() 30575 1726867690.96662: Calling all_inventory to load vars for managed_node3 30575 1726867690.96663: Calling groups_inventory to load vars for managed_node3 30575 1726867690.96665: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867690.96668: Calling all_plugins_play to load vars for managed_node3 30575 1726867690.96670: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867690.96671: Calling groups_plugins_play to load vars for managed_node3 30575 1726867690.97315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867690.98273: done with get_vars() 30575 1726867690.98294: done queuing things up, now waiting for results queue to drain 30575 1726867690.98295: results queue empty 30575 1726867690.98296: checking for any_errors_fatal 30575 1726867690.98299: done checking for any_errors_fatal 30575 1726867690.98299: checking for max_fail_percentage 30575 1726867690.98300: done checking for max_fail_percentage 30575 1726867690.98300: checking to see if all hosts have failed and the running result is not 
ok 30575 1726867690.98301: done checking to see if all hosts have failed 30575 1726867690.98301: getting the remaining hosts for this loop 30575 1726867690.98302: done getting the remaining hosts for this loop 30575 1726867690.98304: getting the next task for host managed_node3 30575 1726867690.98306: done getting next task for host managed_node3 30575 1726867690.98307: ^ task is: TASK: meta (flush_handlers) 30575 1726867690.98308: ^ state is: HOST STATE: block=10, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30575 1726867690.98310: getting variables 30575 1726867690.98310: in VariableManager get_vars() 30575 1726867690.98319: Calling all_inventory to load vars for managed_node3 30575 1726867690.98321: Calling groups_inventory to load vars for managed_node3 30575 1726867690.98322: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867690.98327: Calling all_plugins_play to load vars for managed_node3 30575 1726867690.98328: Calling groups_plugins_inventory to load vars for managed_node3 30575 1726867690.98330: Calling groups_plugins_play to load vars for managed_node3 30575 1726867690.98961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867690.99784: done with get_vars() 30575 1726867690.99797: done getting variables 30575 1726867690.99830: in VariableManager get_vars() 30575 1726867690.99837: Calling all_inventory to load vars for managed_node3 30575 1726867690.99839: Calling groups_inventory to load vars for managed_node3 30575 1726867690.99840: Calling all_plugins_inventory to load vars for managed_node3 30575 1726867690.99843: Calling all_plugins_play to load vars for managed_node3 30575 1726867690.99844: Calling groups_plugins_inventory to load vars for 
managed_node3 30575 1726867690.99846: Calling groups_plugins_play to load vars for managed_node3 30575 1726867691.00533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30575 1726867691.01380: done with get_vars() 30575 1726867691.01397: done queuing things up, now waiting for results queue to drain 30575 1726867691.01399: results queue empty 30575 1726867691.01399: checking for any_errors_fatal 30575 1726867691.01400: done checking for any_errors_fatal 30575 1726867691.01401: checking for max_fail_percentage 30575 1726867691.01401: done checking for max_fail_percentage 30575 1726867691.01402: checking to see if all hosts have failed and the running result is not ok 30575 1726867691.01402: done checking to see if all hosts have failed 30575 1726867691.01403: getting the remaining hosts for this loop 30575 1726867691.01403: done getting the remaining hosts for this loop 30575 1726867691.01405: getting the next task for host managed_node3 30575 1726867691.01407: done getting next task for host managed_node3 30575 1726867691.01407: ^ task is: None 30575 1726867691.01408: ^ state is: HOST STATE: block=11, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30575 1726867691.01409: done queuing things up, now waiting for results queue to drain 30575 1726867691.01409: results queue empty 30575 1726867691.01410: checking for any_errors_fatal 30575 1726867691.01410: done checking for any_errors_fatal 30575 1726867691.01411: checking for max_fail_percentage 30575 1726867691.01412: done checking for max_fail_percentage 30575 1726867691.01412: checking to see if all hosts have failed and the running result is not ok 30575 1726867691.01412: done checking to see if all hosts have failed 30575 1726867691.01414: getting the next task for host managed_node3 30575 1726867691.01415: done getting next task for host managed_node3 30575 1726867691.01415: ^ task is: None 30575 1726867691.01416: ^ state is: HOST STATE: block=11, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False

PLAY RECAP *********************************************************************
managed_node3              : ok=334  changed=10   unreachable=0    failed=0    skipped=312  rescued=0    ignored=9

Friday 20 September 2024  17:28:11 -0400 (0:00:00.875)       0:02:06.392 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 1.95s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.88s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.87s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.84s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.81s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.81s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.78s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.77s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.76s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.76s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.75s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.75s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.72s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.71s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.71s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.22s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:6
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.21s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 1.05s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.97s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 0.95s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
30575 1726867691.01628: RUNNING CLEANUP
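For readability, the connectivity-check payload that appears JSON-escaped throughout the task output above (the `_raw_params`/`cmd` field of the "Verify DNS and network connectivity" task) can be written out as a plain script. This is a verbatim unescaping of what the log already shows, not new logic; note it requires network access to the two mirror hosts when run.

```shell
# Unescaped from the task's "cmd" field in the log above; runs on the managed node.
set -euo pipefail
echo CHECK DNS AND CONNECTIVITY
for host in mirrors.fedoraproject.org mirrors.centos.org; do
  # DNS check: getent resolves via the system's nsswitch configuration
  if ! getent hosts "$host"; then
    echo FAILED to lookup host "$host"
    exit 1
  fi
  # Connectivity check: fetch over HTTPS, discarding the response body
  if ! curl -o /dev/null https://"$host"; then
    echo FAILED to contact host "$host"
    exit 1
  fi
done
```

The `rc: 0` in the task result indicates both hosts resolved and responded; the curl progress meters in STDERR (305 and 291 bytes transferred) are the discarded response bodies.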